administrators

  • RE: OCI support?

    hi @jacob-kemme_1710, thanks for all of the details.

    I'm pretty sure the Docker feeds already support OCI containers. They are basically identical from an API standpoint, and we've made a few tweaks here and there to accept different media types or parse a field in a manifest differently.

    I know for sure that vnd.oci.image.manifest.v1+json is already supported... and if there's something missing, it should be trivial to add. I'd post that as a separate feature request, since it's likely a basic tweak.

    As for Helm... that's a different story. I don't think it's going to work, and it doesn't seem like a good fit for ProGet. Nor is a "generic OCI registry", which is why I'm hesitant to even consider the feature.

    The main issue I have is that an OCI registry is tied to a hostname, not to a URL. This is not what users expect or want with ProGet -- we have feeds. Users want to proxy public content, promote content across feeds, etc. None of this is possible in an OCI registry.

    We got Docker working as a feed by "hijacking" the repository namespace to contain a feed name. Helm charts don't have namespaces, so this is a no go.
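
    To make that concrete, here's roughly what that looks like from the client side; the hostname proget.example.com and the feed name internal-docker are made up for illustration:

    ```powershell
    # The first path segment of the repository name is the ProGet feed name,
    # so one hostname can serve many Docker feeds.
    docker tag mycompany/api:1.0.0 proget.example.com/internal-docker/mycompany/api:1.0.0
    docker push proget.example.com/internal-docker/mycompany/api:1.0.0
    ```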

    We got Terraform feeds working by requiring some stupid prefix on the module name. There's just no other way to do it.

    I don't think that would be smart to do with Helm. And obviously we couldn't do that for "generic OCI" content since it's just meaningless blobs/binaries.

    I'm open to learning more... but my initial take is that if users don't find value in "feeds" (well-defined content types, permissions, segmentation, replication, etc.), and just want a "generic place to shove random content stuff", then ProGet isn't the tool?

    Alex

    posted in Support
  • RE: Creating Users with Native API

    Hi @jipianu_mihnea_1277,

    We don't document this, but @steviecoaster recently figured it out and published a pretty cool library: https://github.com/steviecoaster/InedoOps

    It may already do what you need, so I'd check that out!

    But if not, you should be able to find the answers in
    https://github.com/steviecoaster/InedoOps/blob/main/source/public/Security/Users/Set-ProGetUserPassword.ps1

    As a note, Users_CreateOrUpdateUser is just a stored procedure, so you could also peek at the code to see what it's doing behind the scenes. The groups format is just <groups><group>A</group></groups>
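
    If you do want to call it directly, here's a minimal sketch of what that could look like. The server URL, API key, and parameter names (User_Name, Groups_Xml) are assumptions on my part -- check the stored procedure's parameters (or the InedoOps source above) for the exact names your version expects.

    ```powershell
    # Hedged sketch: call the Users_CreateOrUpdateUser Native API method.
    # Parameter names below are assumptions; verify them against the stored procedure.
    $proGetUrl = "https://proget.example.com"          # hypothetical server
    $headers   = @{ "X-ApiKey" = "your-api-key-here" }

    $body = @{
        User_Name  = "jdoe"
        Groups_Xml = "<groups><group>Developers</group></groups>"  # groups format from above
    } | ConvertTo-Json

    Invoke-RestMethod -Method Post -Uri "$proGetUrl/api/json/Users_CreateOrUpdateUser" `
        -Headers $headers -ContentType "application/json" -Body $body
    ```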

    posted in Support
  • RE: Failed to fetch package tags for <local package name> from registry.npmjs.org.

    Hi @janne-aho_4082,

    This is most certainly related to heavy usage, even though it might not seem that way at first; the connectors are basically a load multiplier. Every request to ProGet is forwarded to each connector over the network, and when you have self-connectors (connectors that point back to feeds on the same server), that's a double burden.

    Keep in mind that an npm restore will make thousands of simultaneous requests, often several for each package, to check the latest version, vulnerabilities, etc. So you end up with more network traffic than a single server can handle -- more RAM/CPU will not help.

    This is most commonly seen as SQL Server connection issues, since SQL Server connections go over the same network. The best solution is to use network load balancing and multiple nodes.

    Otherwise, you have to reduce traffic. Splitting feeds may not help, because the client will then just hit all of those feeds at the same time. "Connector metadata caching" can significantly reduce network traffic, but it comes at the cost of outdated packages: you may "see" a package on npmjs.org (or another feed), but because the query is cached, it won't be available for several minutes.

    Since you're on Linux, I would just use nginx to throttle/rate-limit ProGet. The problem is peak traffic, so start with a limit of around 200 requests and go up from there.
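
    For example, something along these lines with nginx's limit_req module; the hostname, rate, and backend address are placeholders (I'm reading "200" as roughly 200 requests per second), so adjust for your environment:

    ```nginx
    # Throttle requests hitting ProGet; shared-memory zone keyed by client IP.
    limit_req_zone $binary_remote_addr zone=proget_limit:10m rate=200r/s;

    server {
        listen 80;
        server_name proget.example.com;              # hypothetical hostname

        location / {
            limit_req zone=proget_limit burst=400;   # queue short spikes instead of rejecting them
            proxy_pass http://127.0.0.1:8624;        # wherever your ProGet instance listens
        }
    }
    ```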

    Cheers,
    Alana

    posted in Support
  • RE: Proget - Feature Request - Package Upload

    Hi @udi-moshe_0021,

    That's what we'd have to do... but it's not a trivial code change on our end. And then we'd have to test, document, and support it when it doesn't work as expected.

    So it's probably best to just use a script to do the import.

    Thanks,
    Alana

    posted in Support
  • RE: npm feed + nodejs.org connector: huge memory footprint and npm install ECONNRESETs

    Hi @enrico-proget_8830,

    We have fixed multiple issues with npm, specifically with the npm connectors, in the versions released since you installed. I think upgrading to ProGet 2025.26 should resolve the ECONNRESET issue. Could you update to 2025.26, rerun your test, and let us know if you are still seeing the issue?

    Here is a link to the change list since your version of ProGet: https://my.inedo.com/downloads/upgrade?product=ProGet&fromVersion=2024.22

    Thanks,
    Rich

    posted in Support
  • RE: Proget - Feature Request - Package Upload

    hi @udi-moshe_0021,

    I believe that rpm feeds support package upload, so you should already be able to use that in the latest version of ProGet 2024.

    However, I don't think we can add this feature for Debian. The reason is, when you upload a package to a Debian repository, you must specify the "component" it belongs to. This is not available/determinable from just the file name... it's server-side metadata, basically. So you'd have to use a script that pushes the packages.

    Cheers,
    Alana

    posted in Support
  • RE: ProGet: Auto package promotion from NuGet mirror?

    @dan-brown_0128 thanks for clarifying

    2500 is a lot more than I was thinking... I was envisioning a handful of frequently-updated / highly-trusted packages (like AWSSDK). I think the UI would struggle a bit.

    Thinking about this further... a blanket "all future versions are OK" policy seems like it will cause more problems than benefits. That's a lot of decision-making burden to shift to developers, and they're just going to hit "update" the moment they're prompted.

    Best case scenario, that update delivers Zero Business Value... but there's a chance it'll introduce a regression, etc. I'd really need to study the data before recommending a practice like this (let alone adding a supported use case for it). I'd be open to some sort of "automatic approval" feature, but I'd want to see some other factors in there, other than just "name matches X".

    As it stands, "package approval/promotion" isn't very widely used as it's solving a problem most organizations don't believe they have. Instead they drop $300K+ for Clownstrike or another security tool and believe it just "does everything for them". At that price, how could it not!

    So I think it's best to put a pin in this for a bit. It's still a pretty "frontier" discussion/feature, and I don't want to "invent" something that's not going to get a lot of widespread use.

    posted in Support
  • RE: ProGet: Auto package promotion from NuGet mirror?

    Hi @dan-brown_0128, @scampbell_8969,

    We were reviewing this as part of our ProGet 2025+ roadmap, but I'm not sure if it makes sense to do. To summarize...

    "Once a package has been approved (at a name level), all subsequent versions are OK assuming there's no vulnerabilities or license issues. However, the current promotion model requires that we promote each and every version. By automating version promotion, it would allow developers access to newer versions of packages sooner, making access easier and devs will be more likely to upgrade."

    What about just using a connector with package filters in this case? For example:

    1. Create feeds nuget-unapproved and nuget-approved
    2. Restrict promotion from nuget-unapproved -> nuget-approved
    3. Create a connector NuGet.org-all and associate it with nuget-unapproved
    4. Create a connector NuGet.org-approved and associate it with nuget-approved
    5. Edit the connector filters on NuGet.org-approved to block everything except the packages you want

    This would be possible today and would avoid adding a complex feature.

    Thanks,
    Alex

    posted in Support
  • RE: Unable to upload files to asset directories

    Hi @layfield_8963 ,

    I would check under Admin > Diagnostic Center for errors, as well as your browser console.

    I would also use pgutil assets upload to work directly with the API and see if you can get any clues on what the underlying error is:
    https://docs.inedo.com/docs/proget/reference-api/proget-api-assets/file-endpoints/proget-api-assets-files-upload

    Most commonly, it's an antivirus/indexing tool that is locking/blocking the file, but the error message you see will help identify it further.

    Cheers,
    Alana

    posted in Support
  • RE: npm feed + nodejs.org connector: huge memory footprint and npm install ECONNRESETs

    Hi @enrico-proget_8830,

    Can you please tell me which version of ProGet you are running?

    Thanks,
    Rich

    posted in Support