    Inedo Community Forums

    Posts made by stevedennis

    • RE: Support for NotAutomatic/ButAutomaticUpgrades headers in Debian feed Release files

      Hi @geraldizo_0690,

      Thanks for the pointers -- as an FYI, these would have to be feed-level settings, but the drop-downs would be the same.

      Here's the code we use to generate the Release file -- I'm not sure what those other header values do, but we'd probably just want to add the two you suggested.

      What do you think?

      I suspect this will be a quick, opt-in change!

      private void WriteReleaseFile(Stream output)
      {
          using var writer = new StreamWriter(output, InedoLib.UTF8Encoding, leaveOpen: true) { NewLine = "\n" };
          writer.WriteLine($"Suite: {this.Distro}");
          writer.WriteLine($"Codename: {this.Distro}");
          writer.WriteLine(FormattableString.Invariant($"Date: {this.Generated:ddd', 'dd' 'MMM' 'yyyy' 'HH':'mm':'ss' UTC'}"));
          // NotAutomatic: yes  <-- add here
          // ButAutomaticUpgrades: yes  <-- add here
          writer.WriteLine($"Architectures: {string.Join(' ', this.indexes.Select(i => i.Architecture).Distinct(StringComparer.OrdinalIgnoreCase))}");
          writer.WriteLine($"Components: {string.Join(' ', this.indexes.Select(i => i.Component).Distinct(StringComparer.OrdinalIgnoreCase))}");
      
          var desc = FeedCache.GetFeed(this.feedId)?.Feed_Description;
          if (!string.IsNullOrWhiteSpace(desc))
              writer.WriteLine($"Description: {desc.ReplaceLineEndings(" ")}");
      
          writeHashes("MD5Sum:", i => i.MD5);
          writeHashes("SHA1:", i => i.SHA1);
          writeHashes("SHA256:", i => i.SHA256);
          writeHashes("SHA512:", i => i.SHA512);
      
          void writeHashes(string name, Func<IndexHashData, byte[]> getHash)
          {
              writer.WriteLine(name);
              foreach (var i in this.indexes)
              {
                  writer.WriteLine($" {Convert.ToHexString(getHash(i.Uncompressed)).ToLowerInvariant()} {i.Uncompressed.Length,16} {i.Component}/binary-{i.Architecture}/Packages")
                  writer.WriteLine($" {Convert.ToHexString(getHash(i.GZip)).ToLowerInvariant()} {i.GZip.Length,16} {i.Component}/binary-{i.Architecture}/Packages.gz");
              }
          }
      }
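
      For example, with hypothetical feed-level properties (this.NotAutomatic / this.ButAutomaticUpgrades are placeholder names, not existing settings), the addition might look like:

          if (this.NotAutomatic)
              writer.WriteLine("NotAutomatic: yes");
          if (this.ButAutomaticUpgrades)
              writer.WriteLine("ButAutomaticUpgrades: yes");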
      
      posted in Support
    • RE: Bitbucket authentication issues

      Hi @brandon_owensby_2976 ,

      It sounds like you're on the right track with troubleshooting; the issue is definitely on the server-side in this case, so I asked ChatGPT. Who knows if any of this is accurate, but...

      This is a very common situation with older versions of Bitbucket Server (especially pre-6.x / pre-7.x era, but even up to some 7.x versions in certain setups).

      The REST API (e.g. /rest/api/1.0/...) and the Git Smart HTTP protocol (/scm/.../info/refs, /git-upload-pack, etc.) are handled by different authentication filters in Bitbucket Server.

      Most likely you're using a Personal Access Token / HTTP Access Token (most frequent cause in older versions). In many Bitbucket Server versions (especially ≤ 7.17–7.21), HTTP access tokens were designed mainly for REST API and did not work reliably (or at all) for Git over HTTPS in many cases.

      As a workaround, you need to use a real username + password (or username + app password if 2FA is on) for Git operations.

      We've seen similar in really old versions of ADO, GitHub, etc., where API tokens wouldn't work for Git.

      Anyway, I would try that - at least from the curl side of things. And maybe upgrading will help as well. If it works, then you'll likely only be able to use a Generic Git repository with a real username/password -- and just create a special builds user which effectively acts like an API key.
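
      If you want to compare the two endpoints directly, something like this (host/project/repo names are placeholders):

          # REST API -- works with the token
          curl -I -u buildsuser:TOKEN https://bitbucket.corp/rest/api/1.0/projects

          # Git Smart HTTP -- may 401 with the same token on older versions
          curl -I -u buildsuser:TOKEN "https://bitbucket.corp/scm/PROJ/repo.git/info/refs?service=git-upload-pack"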

      Cheers,
      Steve

      posted in Support
    • RE: Suggestion: Show total container image size

      Hi @Stephen-Schaff,

      It seems pretty easy to add these up and display them on the screen! I suppose the "hard part" is the UI...

      A "Total" line doesn't seem to look right. And it seems like too little information to put in one of those info-boxes. "Total Size: XXXX MB" at the bottom just looks incomplete.

      Any suggestions? I'm struggling a bit to see how it could be displayed without looking a little out of place... and since it was your idea I figured I'd ask ;)

      Thanks,

      Steve

      posted in Support
    • RE: Support for NotAutomatic/ButAutomaticUpgrades headers in Debian feed Release files

      Hi @geraldizo_0690,

      I'm not all that familiar with Debian/APT... but I briefly researched this, and it seems like it involves adding values like this at the top of the Release file:

      NotAutomatic: yes 
      ButAutomaticUpgrades: yes
      

      Is that it really? And this setting would impact the entire feed... but have no real relation/impact to connectors or packages?

      If that's the case, how would you envision configuring this? I'm thinking on the Feed Properties page, but perhaps as a checkbox? How do other products/tools do it in your experience?

      Thanks,
      Steve

      posted in Support
    • RE: Bitbucket authentication issues

      Hi @brandon_owensby_2976 ,

      Thanks for the feedback. Based on what you described, it sounds like...

      • BitBucket Server API is working fine
      • Git API is not working due to a failed/failing authentication challenge

      You were able to confirm this with the "Generic Git Repository" also not working. If you were to do a curl -I -u USERNAME:APIKEY https://.../.git you would most certainly get a 401 response as well.

      Anyway, that's where I would start -- try to figure out why the Git API is not accepting the credentials. It's most likely related to permissions on the key, but it's really hard to say... just a guess.

      Thanks,
      Steve

      posted in Support
    • RE: Migrate Feed from C drive to another drive

      Hi @Julian-huebner_9077 ,

      Here's some information on file storage paths:
      https://docs.inedo.com/docs/proget/feeds/feed-overview/proget-feed-storage

      Long story short, if you modify Storage.PackagesRootPath under Admin > Advanced Settings and move your files as needed, then it should work just fine.
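
      For example, on Windows that might look like this (paths are placeholders -- use whatever your instance is actually configured with):

          robocopy "C:\ProgramData\ProGet\Packages" "D:\ProGetData\Packages" /E
          rem then set Storage.PackagesRootPath to D:\ProGetData\Packages and restart the ProGet service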

      Thanks,
      Steve

      posted in Support
    • RE: BuildMaster fails to return TeamCity build configs

      Hi @kquinn_2909 ,

      We haven't forgotten about this; the issue is trying to figure out steps to reproduce it based on the information we have... considering it works on our test instance and all. We may consider putting some more debugging code in, though figuring out how to expose that in this context is a little challenging.

      Just as a sanity check though, do you have a project that doesn't have a "space" in the name? I want to make sure this isn't something as simple as WebProjects%20Replicator vs WebProjects_Replicator.

      The other idea is authentication/authorization, though I would imagine you would get an error accessing the project instead of no builds.

      Thanks,
      Steve

      posted in Support
    • RE: [ProGet] Recently Published Rule

      Hi @jstrassburg_8563,

      "if the resolved version that npm i underscore chose was released in the blocking period, the npm command would 400?"

      If you have "Block Noncompliant Packages" enabled (which we generally don't recommend) and you have a rule that new packages are complaint, then the npm command would most certainly give some kind error.

      You will probably see a 400 code, but I don't think npm will display the message that's sent by ProGet (i.e. "package blocked due to..."). The real issue comes with a large dependency tree, where it'll be hard to know what exactly the issue is.

      As such, we recommend running pgutil builds scan/audit in your CI/CD pipelines instead of blocking. This will produce a much easier-to-understand report, and even allows you to bypass reported issues on a case-by-case basis.

      Thanks,
      Steve

      posted in Support
    • RE: Helm connector with relative url

      Hi @sigurd-hansen_7559 ,

      Thanks for pointing me to the repository; I was able to reproduce this, and it will be fixed via PG-3194 in the next maintenance release (scheduled for Friday).

      Thanks,
      Steve

      posted in Support
    • RE: Not able to download Docker images that doesn't have / in it

      Hi @toseb82171_2602,

      In Docker, images must have a namespace. When they don't, the Docker client will transparently append library/ to those namespaces. In general, this behavior is not desirable, and it's recommended to use library/python or, in a ProGet context, myproget.corp/mydockerfeed/library/python.
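
      For example (tags are illustrative):

          # these two are equivalent; the Docker client adds the default namespace
          docker pull python:3.12
          docker pull docker.io/library/python:3.12

          # recommended explicit form against a ProGet feed
          docker pull myproget.corp/mydockerfeed/library/python:3.12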

      That said, there is a setting on the connector that may help resolve such images, but these images can be problematic even with that.

      As for extending the trial, no problem - you can actually do this yourself on my.inedo.com, on the day of expiry. Of course, please contact us if you run into any issues or have licensing questions.

      Thanks,
      Steve

      posted in Support
    • RE: How do i clone a feed (packages) into a new feed?

      Hi @nachtmahr,

      Yes, that's what I would recommend doing -- using the internal storage path. Just make sure NOT to select the option to delete the files, and make sure to select "search subdirectories" so everything will be imported.

      Cheers,
      Steve

      posted in Support
    • RE: How do i clone a feed (packages) into a new feed?

      Hi @nachtmahr ,

      There is no "clone" method per se, but you can accomplish this by bulk-importing packages into a new feed: https://docs.inedo.com/docs/proget/feeds/feed-overview/proget-bulk-import

      Cheers,
      Steve

      posted in Support
    • RE: [ProGet] Recently Published Rule

      Hi @tim-vanryzin_8423 ,

      Great question on developer experience! We're very curious to learn that ourselves, so please let us know as you implement it.

      First and foremost, if you haven't already, check out the Recently Published & Aged Packages rules blog article to see how this works and our current advice. FYI - we are likely going to change the best practices guidance in 2026 to discourage download blocking.

      From an API/technical standpoint, it's simply not possible to "hide" the fact that 1.12.15 is the latest version. So, if you have a connector, ProGet will report the latest version as reported by the connector.

      But even if it were technically possible, there's simply no great developer experience here. Keep in mind that most developers never look at the ProGet UI -- they configure things once, and forget about it.

      So really, it's just a question of when you want the developer to find out they can't use Xyz-1.12.15. Here are the general options:

      • Manual Curation (no connectors) - it's always going to be on nuget.org / npmjs.org, so even if you manually curate every package in ProGet, they'll know it exists and will be confused why it's not in ProGet; they may not even know who to ask
      • Download blocking - you could simply block downloads of the newest packages, but that's just going to look like random "400 errors" to developers and they will become frustrated
      • Build Auditing - run pgutil builds audit on your build server, and you can see if the packages are noncompliant; this is where we are shifting our advice, as a failed build at that stage will be so much more obvious than a 400 buried in a package restore step

      Ultimately this requires training developers to use lock files and not always get latest. That's why we are shifting to pgutil builds audit -- it's almost self-training. When their builds fail, they will see the reason clearly and should be able to adjust their code/configuration to not use a non-compliant version.
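
      On the npm side, for example, that self-training looks like committing package-lock.json and running npm ci in CI (standard npm commands, nothing ProGet-specific):

          # resolves versions and updates package-lock.json
          npm install

          # installs exactly what the lockfile pins; fails if it drifts from package.json
          npm ci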

      -- Dean

      posted in Support
    • RE: [ProGet] How do I specify Storage.PackagesRootPath in configuration?

      Hi @jonathan-werder_8656 ,

      Thanks for clarifying! I'll be honest, I had no idea where you were getting configuration files in the first place... and I forgot that was ever a thing. Sorry about that.

      Anyway, I don't think that this has worked in years, and it's most definitely not something we'd recommend today. We'll try to track it down and remove it in the docs / help text.

      For your use case (automated installation), just use pgutil settings to set that value.

      Thanks,
      Steve

      posted in Support
    • RE: Support for RPM module streams?

      Hi @henderkes,

      I'm not familiar with module streams, but if you tried it in ProGet and it didn't work, then likely not? It sounds like a different API/endpoint, but I haven't researched it at all.

      However, based on how you described it ("removed from Fedora 39"), it sounds like one of those "good but old" technologies that don't make sense for us to implement.

      Thanks,
      Alana

      posted in Support
    • RE: [ProGet] How do I specify Storage.PackagesRootPath in configuration?

      Hi @jonathan-werder_8656 ,

      You can set this value under Admin > Advanced Settings.

      Thanks,
      Steve

      posted in Support
    • RE: Connector for accessing Jfrog pypi feed

      Hi @pmsensi ,

      You'd need to run through a proxy like Fiddler or ProxyMan, which can capture the outbound / inbound traffic. Or if you provide us with a URL/API key we might be able to play around and attach a debugger. You could always create a free trial or something and set it up that way.

      First thing that comes to mind is that your endpoint is incorrect and you may need to "play around" with the URL. Typically it's like this:

      • endpoint is https://server.corp/path/my-pypi
      • index file is https://server.corp/path/my-pypi/simple
      • ProGet automatically appends /simple unless you specify otherwise on Advanced

      The "Simple " endpoint is just a list of HTML like <a href="...">. That's what ProGet/connectors/pypi follow.

      Thanks,

      Steve

      posted in Support
    • RE: PHP Composer feed connect to vcs type repository or how to upload

      Hi @ayatsenko_3635, @dubrsl_1715,

      Thanks for the feedback and continued discussion.

      Our Composer feed is relatively new, so we are open to exploring new ideas. One option might be to do an "import" of packages into ProGet, similar to how we handle connectors in Terraform feeds. But that's something that could also be done with a script, so we'd want to see what that looks like as a prototype.

      That said, we definitely can't implement "content pointers" to Git repositories. The Git-based "package" model is not only outdated but it has several "fatal" Software Supply Chain problems.

      The biggest problem, by far, is that package content is hosted by a third party (i.e. GitHub) and managed by another third party (i.e. the repository owner). At any time, a package author or the host can simply delete the repository and cause a major downstream impact - like the infamous left-pad incident in npm.

      This is why other package ecosystems have adopted "read-only" package repositories and disallow deletes (except for rare cases like abuse). Once you upload a package to npmjs, nuget, rubygems, etc. -- it's permanently there, and users can always rely on that being the case.

      That's simply not possible with Git-based "packages". The "content pointer" must be periodically updated, such as when the author decides to move the Git repo to GitLab, Gitea, etc. Now you no longer have a read-only package repository, but one that must be editable. Who edits? Are they tracked? What edits are allowed? Etc.

      There are several other issues, especially with private/organizational usage, like the fact that there's no reasonable way to QA/test packages (i.e. compared to using a pre-release/repackaging workflow) and that committing/publishing are coupled (i.e. you tag a commit to publish it). This makes governance impractical.

      And that's not to mention the fact that there's no real way to "cache" or "privatize" third-party remote Git repositories. Unless, of course, you mirror them into your own Git server... which is technically challenging, especially if the author changes hosts.

      We first investigated Composer feeds years ago, but Packagist didn't have package files at the time -- only Git repository pointers. Like Rust/Cargo (which used to be Git-based, and still technically supports Git-based packages), we anticipate this same maturity coming to Packagist/Composer as well.

      So while we certainly understand that this is a workflow shift, it's a natural evolution/maturation of package management. Likewise, it took decades for the development world to shift from "file shares" and legacy source control to "Git", but that's what everyone has standardized on.

      That's the world ProGet operates in, so it might be a bit "too far ahead", or we might be misreading the tea leaves - but a "package mindset" is a requirement for using ProGet. Hopefully we can make that easier, possibly by exploring "package imports" or something like that.

      Cheers,
      Steve

      posted in Support
    • RE: Known licenses are shown as unknown

      Hi @frank-benson_4606 ,

      Thanks for clarifying, that makes sense.

      I'm afraid that ProGet does not "crawl" the parent artifacts for metadata; we had considered it, but it's rather challenging to do from an engineering standpoint, difficult to present crawler errors, and fairly uncommon.

      Thanks,
      Steve

      posted in Support
    • RE: PHP Composer feed connect to vcs type repository or how to upload

      Hi @dubrsl_1715 ,

      Thanks for the clear explanation; I'm afraid your best bet here is to "take the plunge" and adopt package management best practices. A key tenet being a "self-contained archive file with a manifest" that is read-only and "cryptographically sealed" (i.e. hashed).

      A "pointer to a Git commit" seems convenient at first, but there's a good reason that many ecosystems (including Composer) have moved away from it -- ultimately it's just not scalable and doesn't handle team workflows very effectively.

      This will likely involve updating your composer.lock files and also the way you publish packages. In general, your process for creating a new version of a PHP package should look something like:

      1. Checkout code
      2. Modify the composer.json file with a version number (see the sketch below)
      3. Package the file
      4. Publish to ProGet
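
      For step 2, a minimal composer.json might look like this (name/version are placeholders):

          {
              "name": "mycompany/my-library",
              "version": "1.0.0",
              "require": {
                  "php": ">=8.1"
              }
          }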

      As you get more mature, you can get into pre-release versioning and repackaging.

      Hope that helps,

      Steve

      posted in Support
    • RE: Known licenses are shown as unknown

      Hi @frank-benson_4606,

      I looked into this a bit closer now.

      Looking at the commons-io-2.14.0.pom, there is no licenses element specified. The pom should have one, and it'd be nice if the package authors added it; if you requested that via a pull request or issue on their GitHub, I'm sure they would. In any case, that's why it's not showing in ProGet.
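
      For reference, the standard Maven element looks something like this (values are what you'd expect for commons-io, not copied from an actual pom):

          <licenses>
            <license>
              <name>Apache-2.0</name>
              <url>https://www.apache.org/licenses/LICENSE-2.0.txt</url>
            </license>
          </licenses>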

      This is why you see the unknown license detected, and that means you have to click "Assign License Type to Package" for ProGet to associate the package/license. I assume that you did that on 2.14.0, and selected Apache-2.0.

      By default, that selection only applies to the specific version; if you wanted it to apply to all versions of commons-io (including future ones not yet published), you'd need to click "Apply to all versions".

      If you navigate to SCA > Licenses, and click on Apache-2.0, you can see the assignment to the package under the "Purls" tab. It would show: pkg:maven/commons-io/commons-io@2.14.0 for the version you selected.

      You will need to either do this for all versions or decide if you want to add an entry to the Package Name tab (i.e. pkg:maven/commons-io/commons-io) under the Apache-2.0 license definition.

      Thanks,
      Steve

      posted in Support
    • RE: ProGet encryption key decryption failure

      Hi @sneh-patel_0294 ,

      What I mean is, in your browser, open multiple tabs -- one for /administration/cluster on each node in the cluster, bypassing the load balancer. All nodes should show "green" for that.

      The one that shows "red" still has the wrong encryption key. Fix the encryption key, restart the services, reload the tab, and it should work fine.

      Thanks,
      Steve

      posted in Support
    • RE: ProGet encryption key decryption failure

      @sneh-patel_0294 to restart the services, you can do so from the Inedo Hub or Windows Services (look for INEDOPROGETSVC and INEDOPROGETWEBSVC). If you're still using IIS, make sure to restart the app pool as well.

      posted in Support
    • RE: ProGet encryption key decryption failure

      Hi @sneh-patel_0294 ,

      This message means that the decryption keys across machines are different, which results in exactly the behavior you describe (403s, logouts):
      https://docs.inedo.com/docs/installation/configuration-files

      I know you mentioned you already checked, so there's likely a typo, miscopy, looking at the wrong folder, etc. Note that the file is %PROGRAMDATA%\Inedo\SharedConfig\ProGet.config, as opposed to C:\ProgramData\Inedo\... - on some machines, the program data folder is stored in a different location.
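
      For reference, the file is XML and the key lives in an element roughly like this (structure from memory -- see the linked doc for the authoritative format):

          <InedoAppConfig>
            <EncryptionKey>0123456789ABCDEF0123456789ABCDEF</EncryptionKey>
          </InedoAppConfig>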

      I would also make sure to restart the service/web as well. To test, you can try loading that page on all nodes; you should not see "Encryption key decryption failure" when refreshing the nodes.

      Hope that helps,
      Steve

      posted in Support
    • RE: Known licenses are shown as unknown

      Hi @frank-benson_4606 ,

      This appears to be a known issue (PG-3153) that causes certain URL-based licenses to not be detected; it will be fixed in 2025.15, releasing this Friday.

      If you're using Docker, you can try upgrading to inedo/proget:25.0.15-ci.4, which should have that fix in it.

      Thanks,
      Steve

      posted in Support
    • RE: ProGet 2025.10: License Update API Issues

      @jw good call, we'll get this fixed in this week's release via PG-3161.

      posted in Support
    • RE: 'Usage & Statistics' info missing

      Hi @k-lis_1147,

      Sorry for the slow reply; we did not get a chance to investigate in the last release, but it was on the list for this week. That being said, it turned out to be an easy fix (we didn't anticipate it being a copy/paste fix) -- and we'll get it in via PG-3160 in this week's maintenance release.

      Thanks,
      Steve

      posted in Support
    • RE: How to create a license attribution report?

      Hi @frank-benson_4606,

      Whoops, it looks like that was kept in the documentation by mistake; I just removed it now.

      We had planned that feature way back in ProGet 2023, but it was never implemented. That feature -- as well as some of the more advanced license compliance ideas we had -- has since left our roadmap due to a total lack of interest from end-users.

      The main reason for the lack of interest is that pgutil builds audit lists all packages and licenses, and most users found that to be sufficient. They just hand that list to the legal team, who creates the amendment. So perhaps that will suffice in your use case as well.

      Let us know if not; we're always open to hearing more.

      Thanks,
      Steve

      posted in Support
    • RE: Known licenses are shown as unknown

      Hi @frank-benson_4606,

      ProGet's license detection generally requires that a package is cached or local to ProGet in order to detect the license. When you visit the package page, a request is made to download the metadata from the remote connector, which is how you can see the license in that case.

      That being said:

      • you can enable OSS Metadata Caching, which will perform these requests on remote packages -- but it's obviously a performance hit
      • there is a known bug (fixed in 2025.15, releasing Friday) that causes certain URL-based licenses to not be detected (PG-3153)

      Hope that helps with troubleshooting. A prerelease version of 2025.15 is available should you be interested.

      Thanks,
      Steve

      posted in Support
    • RE: Reporting and SCA

      Hi @rick-kramer_9238,

      That vulnerability is in our database as PGV-2228003, and it shows up when I view that package:

      [screenshot: package page showing vulnerability PGV-2228003]

      If you can provide more details about what you mean by "the report is telling us no vulnerabilities are detected", I can investigate further.

      Thanks,
      Steve

      posted in Support
    • RE: Remote NuGet package cached after unlisting

      Hi @yaakov-smith_7984 ,

      This behavior is expected and by design. "Deprecated" and "Unlisted" are server-side metadata (i.e. stored in the remote repository, not the package itself), and once a package is brought into a different server (i.e. ProGet), it's "disconnected" from the other server.

      That being said, there is a feature in ProGet that can routinely "sync" this server-side metadata:

      • https://docs.inedo.com/docs/proget/sca/howto-deprecated-package-alerts
      • https://docs.inedo.com/docs/proget/sca/policies#oss-metadata-updating-caching

      This feature obviously comes with some performance costs, though you'd really have to enable it to see if that has any impact on operation.

      Another approach is to use a retention policy that deletes cached packages older than 90 days.

      Thanks,
      Steve

      posted in Support
    • RE: Upload to Debian Feed fails with "Package does not have a control file."

      Hi @frei_zs,

      Based on the fact that unpackaging/repackaging it works, there's definitely something "wrong" with the original package file.

      Debian packages use a tarfile-based format, and there are several "buggy" tarfile writers that don't get the format quite right. If I remember correctly, some ancient versions of dpkg wrote these files incorrectly. Some tarfile readers account for these errors while others (perhaps including ours) do not.
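
      If you want to inspect the original file yourself, a .deb is an ar archive wrapping tarballs; something like this (filename is a placeholder; compression suffixes vary by dpkg version):

          $ ar t mypackage.deb
          debian-binary
          control.tar.gz
          data.tar.gz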

      We may be able to attach the file to a debugger and give more details, but if this is a one-off or rare circumstance, then I would just repackage it and not worry about it.

      Thanks,
      Steve

      posted in Support
    • RE: ProGet 2025.14: Vulnerability Database Updater causes duplicates in PackageNameIds

      Hi @jw ,

      Did you try this on a new instance, or did you discover this on your (older) instance?

      This was a known issue through several versions of ProGet 2025, and it impacts mostly SCA as you noticed. However, the vuln updater has since been fixed, so it shouldn't be continuing.

      The "feed reindex" function can also merge/fix these duplicate names. They should be detected during a "feed integrity" check, and show as a "warning".

      Thanks,

      posted in Support
    • RE: Question on Single-Instance Performance (Docker Deployment with Embedded PostgreSQL)

      Hi @koksime-yap_5909 ,

      I'm afraid we can't provide much clearer guidance than that, as there are so many factors involved that make predicting performance basically impossible. For example, the feed types you're using, your CI server configuration, how often developers are rebuilding, etc.

      The article you found is actually what we send users who experience symptoms of server overload, to help understand where it comes from and how to prevent it. As the article mentions, the biggest bottleneck is network traffic at peak times - there's only so much that a single network card can handle, and scaling CPU/RAM doesn't really help.

      This is where load-balancing comes in. The main downside is complexity/cost, which is why most customers start with a single instance. It can take quite a while for a tool like ProGet to be fully onboarded across teams, so performance problems likely won't happen at first.

      Hope that helps, let us know if you have any other questions!

      Thanks,

      Steve

      posted in Support
    • RE: Clarification on Retention Rules and Recently Created Files Being Deleted

      Hi @koksime-yap_5909 ,

      Good catch; that is most definitely a bug. I just checked, and it's isolated to assets - packages and Docker images work as expected.

      This will be fixed in the upcoming maintenance release via PG-3150; it's shipping Friday, but we can provide a pre-release if you're interested in testing earlier.

      Thanks,
      Steve

      posted in Support
    • RE: Clarification on Retention Rules and Recently Created Files Being Deleted

      Hi @koksime-yap_5909,

      In the event that the artifact has not been downloaded (i.e. the last download date is "null"), the publish date will be considered instead. So if you set "90 days", an artifact that hasn't been downloaded will be deleted, at the earliest, 90 days after publication.
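
      As a minimal sketch of that logic (property names are illustrative, not our actual implementation):

          var referenceDate = artifact.LastDownloadDate ?? artifact.PublishDate;
          if (DateTime.UtcNow - referenceDate > TimeSpan.FromDays(90))
              Delete(artifact);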

      Thanks,
      Steve

      posted in Support
    • RE: Lost Administrator Rights — How to Restore Admin Access?

      Hi @koksime-yap_5909,

      The command will recreate the user, restore administrative privileges, etc. It's safe to run - and you'll ultimately be left with an Admin/Admin user that you can log in as.

      On ProGet 2025, the command is proget or proget.exe. We should update the docs for sure.
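
      In other words, assuming the same verb as the older ProGet.exe tool, that would be:

          proget resetadminpassword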

      Thanks,
      Steve

      posted in Support
    • RE: Rocky Linux rpm feed not working

      Hi @Sigve-opedal_6476 ,

      There are some known issues that we intend to fix with PG-3144 in the next maintenance release (scheduled for Friday). This will likely be resolved then.

      The inedo/proget:25.0.14-ci.10 container should have these changes in it, if you'd like to try it out sooner.

      Thanks,
      Steve

      posted in Support
    • RE: inedoxpack error: No extensions were found...

      @yakobseval_2238 thanks for letting us know, I just updated it!

      posted in Support
    • RE: 'Usage & Statistics' info missing

      Hi @k-lis_1147,

      Based on what you described, it should show up.

      Can you confirm what feed type you're using, and whether or not you're using PostgreSQL? (This is the default for ProGet 2025.)

      I just discovered a bug (PG-3145), impacting PostgreSQL (probably all feeds) and certain feed types on SQL Server (Maven), that would cause that information not to display on that page.

      Easy fix, but just want to double-check.

      Thanks,
      Steve

      posted in Support
    • RE: Lost Administrator Rights — How to Restore Admin Access?

      Hi @koksime-yap_5909 ,

      If you ever get "locked out" of an Inedo product, either due to misconfiguration or a lost password, you can restore the default Admin/Admin account and reenable Built-in User Sign-on by using ProGet.exe resetadminpassword

      Here's more information on this procedure:
      https://docs.inedo.com/docs/installation/security-ldap-active-directory/various-ldap-troubleshooting

      Thanks,

      Steve

      posted in Support
    • RE: Mark private Nuget/Npm Packages as Vulnerable?

      Hi @tayl7973_1825 ,

      Thanks for the feedback; this is all a relatively new space, so we're in the process of building best practices / advice as well as tools to help teams solve these problems.

      "Right now, based on your suggestion, it sounds like the workflow would require us to manually identify which applications depend on a vulnerable library, notify each owning team..."

      You are correct - the SCA Builds & Projects functionality is designed to "provide that link" between specific package versions and specific builds of applications. The builds are a moving target, as they may or may not be active/deployed.

      The "Project" in ProGet is not intended to the "source of truth" about the project itself, but be sort of sync'd with the truth (e.g. like an Application in BuildMaster). That's why there's a "stages timeline" for builds in PRoGet.

      "...hope it fits within their priorities, and then track remediation through individual tickets."

      Our advice here is to think of it more like, "advise them of the identified security risk and unavailability of the impacted library they are using". Ultimately it should be up to the team (their product owner) to evaluate the risk you identified and mitigate it. For example, TeamLunchDecider1000 can probably live with a security risk, but let the team decide.

      Once you've removed the library from ProGet, they can't use it anymore and it's "no longer your problem" to worry about or track through tickets.

      "Ideally, we were hoping our package management system — since it already governs distribution and security controls — could act as that “one stop shop” to track and visualize which applications still rely on a vulnerable version alongside its assigned severity rating."

      ProGet already provides visibility into consumers through SCA, and you can already see how OSS Vulnerabilities impact builds.

      HOWEVER, our core advice here is to not try to establish your own in-house "vulnerability database" for the in-house libraries in your organization. Even large orgs (2000+ developers) won't do that.

      Instead, it's a simple binary decision: PULL or KEEP the library. If you PULL, then notify consumers it's unavailable going forward and let them decide how to mitigate.

      That approach is superior to OSS Vulnerability workflows, but it's obviously not possible for OSS library authors to do.

      Cheers,
      Steve

      posted in Support
    • RE: Mark private Nuget/Npm Packages as Vulnerable?

      Hi @tayl7973_1825 ,

      Thanks for clarifying it. Based on that, I would say that "Vulnerabilities" are most definitely the wrong tool for the job. You can certainly "hammer in a screw" but there's a better way to do it - and we don't make "screw hammers" at Inedo 😉

      We're working on best practices / guidance on how to build security policies around these topics, but I'll try to give some tips now.

      What you're describing isn't a vulnerability per se, but a SAST Issue: a potential weakness in code detected by a static analysis security tool. Most of these are false positives and present no real security risks, but some are.

      If you discover a SAST Issue in one of your libraries, then you should use the following process:

      1. Evaluate if it's a false positive or not
      2. Unpublish the library internally if there's a security risk
      3. Enumerate the consumers (i.e. applications in flight or deployed to production)
      4. Evaluate the security risk (low, high), based on the consumers/usage
      5. Notify the application teams to upgrade the library as appropriate

      Note how this process is fundamentally different than OSS packages / vulnerability workflows:

      • you can unpublish/block packages from your repository
      • you know which applications are consuming your packages
      • you know which teams maintain which applications
      • you can work with those teams to assess the risks

      Bottom line: if a package causes a real security risk, then unpublish it and fix the consuming applications as appropriate. Otherwise, don't.

      There's really no middle ground or room in this process for "Vulnerabilities" - and trying to curate an internal "vulnerability database" is just going to make things less secure in your organization.

      That's a theme in our upcoming content, but the general idea is when you treat all issues/vulnerabilities as security risks, then it's impossible to focus on the ones that are actual risks -- and it's as meaningless as saying "everything is a top priority".

      Thanks,
      Steve

      posted in Support
    • RE: Mark private Nuget/Npm Packages as Vulnerable?

      Hi @tayl7973_1825 ,

      This is not possible, nor is it a workflow we'd recommend supporting. Vulnerabilities have a very specific meaning/use case -- third-party discoveries in open-source packages that may impact your code (but probably won't) -- and it's not a good idea to "abuse" them for other purposes.

      Deprecation is one solution, but a better one would be to use SCA and monitor how that package is being used, so you can understand the impact on library consumers:
      https://docs.inedo.com/docs/proget/sca/builds

      Thanks,
      Steve

      posted in Support
    • RE: inedoxpack error: No extensions were found...

      Hi @yakobseval_2238 ,

      Can you let us know the commands/arguments you're using?

      Thanks,
      Steve

      posted in Support
    • RE: Apply license key inside container

      Hi @jlarionov_2030,

      I haven't tested or tried it, but I can't help but wonder if the API is responding with some kind of "license required" error and blocking the setting.

      I suppose we could investigate and try to resolve the error, but automated setup with a license key isn't so common a requirement... if this is not really something you will do that often, perhaps it's not worth the effort.

      Let us know your thoughts.

      Thanks,
      Steve

      posted in Support
    • RE: Apply license key inside container

      Hi @jlarionov_2030 ,

      As of ProGet 2023 (or maybe earlier?), license keys are no longer requested/entered at installation time; they're entered in the software itself. This only matters on new instances.

      You can use pgutil settings to set a license key if you'd like.

      Thanks,
      Steve

      posted in Support
    • RE: Not able to upload .spd files to proget assets

      Hi @parthu-reddy,

      Thanks for discovering/confirming that; unfortunately we're not able to reproduce this issue, as the multi-part / chunked uploads already take into account multiple servers.

      • Chunked upload sessions are persisted in the shared database (ChunkedUploads table)
      • Bytes are appended to a file stored in shared store

      Would you be able to dig into the request patterns a little more? I suspect there's "something" configured on the load-balancer that's "doing something weird" with these ranged requests.

      The Multipart Upload API explains what's happening behind the scenes, and you may find that using pgutil assets upload is easier to troubleshoot.

      Thanks,
      Steve

      posted in Support
    • RE: ProGet 2025.10: License Update API Issues

      Hi @jw ,

      It definitely looks like there's some "drift" between the documentation and behavior. I already see some stuff about "allowedFeeds" in there, which isn't even a thing since policies were introduced. And you're right, you can't update code/title, which aligns with the pgutil behavior as well.

      We'd rather have "no documentation" than wrong documentation, any suggestions on what to delete from here? https://github.com/inedo/inedo-docs/blob/master/Content/proget/api/licenses/update.md

      Feel free to submit a PR if you've got a clear idea.

      In general, we want to make sure the pgutil docs are accurate (those are very easy to test), and we figure... someone can just look at the pgutil code to learn the HTTP endpoint or library.

      Thanks,
      Steve

      posted in Support
    • RE: npm connector returns 400

      Hi @udi-moshe_0021 ,

      My guess is that your proxy server is blocking certain things or having issues with redirects; you'd really have to monitor the back-and-forth traffic to see what's going on.

      As you can see, there are a lot of "redirects" going on, and URLs/domains that you will not expect and can be hard to predict.

      Thanks,
      Steve

      posted in Support