Hi @inedo_1308 ,
This is likely a regression due to PG-3074 -- we'll take a look and report back!
Thanks for letting us know.
-- Dean
When one of the connectors reports a 404, ProGet should move on to the next connector to search. Can you help set up a reproduction case using the ProGet UI and URLs (perhaps some basic curl commands)? You should be able to do everything with basic GET requests and see different results.
That'll make it a lot easier to test and see if we can reproduce the issue.
Thanks,
Steve
@yaakov-smith_7984 great news!
We are planning to include it as is in the next maintenance release, later today. So it's "stable enough" we figure :)
Hi @jim-borden_4965 ,
Thanks for the feedback! We will add a note to the migration wizard via PG-3080 clarifying that mounting the volumes is required - there's no easy way for us to tell if the user did that.
Thanks,
Steve
Hi @arose_5538 , we will have this fixed in Friday's maintenance release via PG-3077
Thanks,
Steve
Hi @m-lee_3921 ,
Sounds like you're off to a great start with Universal Packages - that's exactly what we intended them for.
Great spot on the docs. In retrospect, using the word "deprecated" was incorrect - we should have said "legacy", for the exact reason you mentioned. We intended to add those capabilities to the common packages API when we wrote the docs, but priorities...
Anyway I clarified the docs:
Legacy Universal Feed Endpoints:
The following endpoints are duplicative of the Common Packages API endpoints and should be avoided when possible:
- List Universal Packages - lists specified packages
- List Universal Package Versions - describes versions of specified packages
- Delete Universal Package - deletes a specified package
- Download Universal Package - downloads a specified package
While we don't plan on removing them in the foreseeable future, they are considered legacy. Once the Common Packages API includes metadata queries, we will likely call these endpoints "deprecated".
It wouldn't be "too hard" to add metadata queries to the Common Packages API now that everything has been refactored... but there hasn't been any real user demand for it. And it's already easy to do with upacks, as you can see.
Anyway, when folks start asking for this from other package types, we'll add it to our roadmap. For now, "legacy" is good enough.
Thanks,
Steve
@yaakov-smith_7984 good news -- we're working on a new version of the Active Directory integration that may address this problem -- want to give it a shot?
It's in our pre-release feed:
https://docs.inedo.com/docs/proget/administration/extensions#pre-releases
The extension is InedoCore 3.0.5-CI.2:
https://proget.inedo.com/feeds/PrereleaseExtensions/inedox/InedoCore/3.0.5-CI.2
We have been running it in our environment for a while.
This would be an ideal place to fix the issue, or at least add some kind of option to make it work better.
Hi @fabrice-mejean_5174 ,
Thanks for the suggestion; to help move this forward, we thought about this from a technical standpoint. We're a little hesitant to call the command `pgutil packages audit` since there's already an audit command for builds and vulnerabilities. One idea is to deprecate `pgutil vulnerabilities audit` and call it `pgutil packages audit` instead.
In any case, this new command is a combination of `pgutil builds audit` and `pgutil vulns audit`.
The `pgutil packages audit` command would input a project, something like this:
pgutil packages audit --project=c:\projects\MyProject.csproj
Behind the scenes, `pgutil` would parse and POST a packageset:
POST /api/sca/audit-packages
[
  {
    "name": "myPackage",
    "version": "1.2.3",
    "type": "nuget"
  },
  {
    "name": "myPackage",
    "version": "1.2.3",
    "type": "nuget"
  }
]
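To illustrate the client-side half, here's a rough sketch of pulling `PackageReference` entries out of a .csproj into that packageset shape. This is illustrative only - not pgutil's actual implementation, which handles lockfiles and other project types:

```python
import json
import xml.etree.ElementTree as ET

def csproj_to_packageset(csproj_xml: str) -> list:
    """Collect <PackageReference> items from a .csproj into the packageset shape."""
    root = ET.fromstring(csproj_xml)
    packages = []
    for ref in root.iter("PackageReference"):
        name, version = ref.get("Include"), ref.get("Version")
        if name and version:
            packages.append({"name": name, "version": version, "type": "nuget"})
    return packages

sample = """<Project Sdk="Microsoft.NET.Sdk">
  <ItemGroup>
    <PackageReference Include="Azure.Core" Version="1.35.0" />
    <PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
  </ItemGroup>
</Project>"""

print(json.dumps(csproj_to_packageset(sample), indent=2))
```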
The API would return an array I suppose (we don't have any samples for that, but it's a serialized BuildInfo.cs), and the end result would look like this:
$> pgutil packages audit --project=c:\projects\MyProject.csproj
Parsing MyProject.csproj... found X packages.
Azure.Core-1.35.0
Compliance : Compliant
License : MIT
Vulnerabilities : None
Microsoft.Extensions.Configuration.EnvironmentVariables 8.0.0
Compliance : Noncompliant
License : MIT, Apache-2.0
Vulnerabilities : PG-123456 (High)
the vulnerability title of this vulnerability goes here
The API could obviously contain more info. It's documented via serialized .cs classes here:
https://github.com/Inedo/pgutil/tree/thousand/Inedo.ProGet
Thanks,
Steve
@alex_6102 thanks for the additional information! That makes sense - you are effectively looking for an external authority for content verification.
As far as I'm aware, Debian is the only packaging system that supports this, although NuGet has some limited concept of "signing" for author verification. We evaluated it and it simply doesn't make sense.
That being said, if you're willing to experiment with what we have in ProGet today, we might be able to make it work. It'll require a little database "hacking".
First, you'll need a new instance of ProGet without an encryption key set up. That's the default configuration for a Docker image.
After creating a Debian feed, take a look in the `Feeds` table for the `FeedConfiguration_Xml` column. You will find a base-64 encoded property called `FeedSigningKeyRing`.
Here is how we generate that:
var bundle = new PgpKeyCollection { PgpKey.Generate(4096, $"{this.FeedName}@proget") };
this.FeedConfig.FeedSigningKeyRing = bundle.Encode();
Honestly I have no idea what that does or what format it's encoded as. This is a black box from the `Org.BouncyCastle.Bcpg.OpenPgp` library, and we just store the bytes as you can see.
Here is how we use those bytes:
context.Response.AppendHeader("Content-Disposition", $"attachment; filename=\"{feed.FeedName}.asc\"");
var keys = new PgpKeyCollection(feed.FeedConfig.FeedSigningKeyRing);
using var output = context.GetOutputStreamWithCompression();
keys.WritePublicKeys(output);
If you're able to "replace those bytes" with a format that works, then we can add a page that allows you to specify your own. If it's just UTF8-encoded ASCII, awesome. If not, we need some kind of instructions on how to replace those bytes too.
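If anyone wants to poke at that column, here's a rough sketch of pulling the property out for inspection. The element name and XML shape here are assumptions based on the property name above, not ProGet's actual schema:

```python
import base64
import xml.etree.ElementTree as ET

# Hypothetical shape of the FeedConfiguration_Xml column -- an assumption
# for illustration; the real schema may differ.
feed_config_xml = """<DebianFeed>
  <FeedSigningKeyRing>aGVsbG8gd29ybGQ=</FeedSigningKeyRing>
</DebianFeed>"""

root = ET.fromstring(feed_config_xml)
keyring_b64 = root.findtext("FeedSigningKeyRing")
keyring_bytes = base64.b64decode(keyring_b64)

# Dump the raw bytes to a file so they can be inspected with something like
# `gpg --list-packets keyring.bin` to figure out the encoding.
with open("keyring.bin", "wb") as f:
    f.write(keyring_bytes)
```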
Otherwise, this is beyond our current understanding and would require us to (re-)learn how all this stuff works, test it, etc. Which is what makes it nontrivial.
Thanks,
Steve
Hi @mickey-durden_1899 ,
Great! I've added the property as discussed and published AWS v3.1.4-RC.1 to our prerelease feed.
You can download and manually install, or update to use the prerelease feed:
https://docs.inedo.com/docs/proget/administration/extensions#manual-installation
After installing the extension, the "Disable Payload Signing" option will show up on the Advanced tab - and that property will be forwarded to the PUT request. In theory that will work, at least according to the one post above.
One other thing to test would be uploading Assets (i.e. creating an Asset Directory) via the Web UI. That is the easiest way to do multi-part upload testing. I did NOT see an option to DisablePayloadSigning on the multi-part uploads, just on Put.
Thanks,
Steve
Hi @yaakov-smith_7984 , sounds like progress.... can you try to do a "group lookup" using the AD testing tool (page)? That's what the permissions page will do prior to saving.
That might give you a clue to the queries that are happening.
Hi @caterina,
It looks like `Consoleman` is an internal Inedo NuGet library. It's available on https://proget.inedo.com/nuget/NuGetLibraries/v3/index.json but I think we should publish it to nuget.org.
Will discuss with the team.
Thanks,
Steve
The built-in Configuration doesn't seem to include IPs, which means you'd want to create a PowerShell script that could read/set the configuration.
Alternatively, it could be added relatively easily as a property, assuming it's something that can be configured with the library/code we're using.
This is something we could definitely help with as a paid user, but we're not focused on Otter free/community at the moment. If you're working with someone on the sales side, I'd bring it up -- these days, Otter sales tend to involve a lot of implementation services from our partners, and a lot gets done with scripting - but if you're having success implementing on your own, we could likely work together in that regard.
Cheers,
Steve
Hi @m-karing_2439 ,
The "Dependencies" link is provided for convenience; it's simply wrapping an `<a href=...>` around items in the .nuspec file's Dependencies element. Without performing expensive lookups, there is no way to know if those packages are in the current feed, or if they're even valid packages or versions at all.
Keep in mind that, in many cases, dependencies won't even reference a single package (e.g. MyPackage-4.8.7) - but a whole range of versions (e.g. MyPackage-4.*). This is why it's impossible for ProGet (or any tool) to "resolve" the dependencies - it requires knowing all packages in project (plus the project's environmental configuration) to know which ranges to pick.
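To make the range point concrete, here's a toy sketch (nothing like NuGet's real resolver, which handles brackets, exact ranges, prerelease tags, and more) of why a range like `MyPackage-4.*` can't be turned into a single link:

```python
def matches_floating(range_spec: str, version: str) -> bool:
    """Toy matcher for NuGet-style floating versions like '4.*'.
    The real NuGet resolver is far more involved than this sketch."""
    if range_spec.endswith("*"):
        prefix = range_spec[:-1]  # "4.*" -> "4."
        return version.startswith(prefix)
    return version == range_spec

# Several concrete versions satisfy the same floating range, so there is
# no single package page a "Dependencies" link could point to.
candidates = ["4.0.0", "4.8.7", "5.0.0"]
matching = [v for v in candidates if matches_floating("4.*", v)]
```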
Note that if those links yield a `404`, then clients will also not be able to download them from the feed.
Thanks,
Steve
Hi @yaakov-smith_7984 ,
Thanks for the additional details; so it sounds like the issue is that ProGet's Active Directory Integration does not support Active Directory group lookup using a "pre-Windows 2000" Group Name?
I'm not sure if this is something we'll want to support in the Active Directory integration. That integration is sensitive and performance-critical, so trying to add in support for group names from 1990s-era domain configurations doesn't make a lot of sense.
For reference, here is our Active Directory Integration code:
https://github.com/Inedo/inedox-inedocore/tree/master/InedoCore/InedoExtension/UserDirectories/ActiveDirectory
Why Entra is using those ancient fields is beyond me, especially since all AD objects already have a unique ID. I suspect those are simply not intended to be queried or used by third-party tools. Maybe they're unstable and subject to change. I would try to get ahold of someone at Microsoft to get some answers on how to handle this.
That said, did you try the OpenLDAP/Generic directory? That will allow you to define your own LDAP queries. I think that's what you'll need to do.
Thanks,
Steve
@steviecoaster the main headaches come when you try to use a kind of CDN (Front Door) or anything else that will cache/manipulate/munge requests; it's often slower anyway, so there's no real benefit
I know a few customers were hit with some kind of bug in Gateway that caused problems in clients (Docker particularly) - it was a long-standing, multi-year bug in Gateway, so who knows if it ever got fixed
Hi @yaakov-smith_7984 ,
I'm afraid I'm not really sure what's happening; it's clear that they're doing something "pretty weird" behind the scenes, and the integration is designed to use Active Directory, not this system.
You can see the queries that ProGet is making -- and if those queries don't return those "weird" groups like $JQA200-TQZJ4H3I77X8, then it's not going to work.
Hopefully the system will work with LDAP queries. If you use the OpenLDAP/Generic directory, you can customize the property names and queries to hopefully get it working: https://docs.inedo.com/docs/installation/security-ldap-active-directory/various-ldap-openldap
Let us know what you find out,
Thanks,
Steve
@james-woods_8996 good to know - I've just removed that from the docs since I think that was an instruction from earlier versions (before ProGet 2022 maybe)
Hi @james-woods_8996 ,
Setting the `ASPNETCORE_URLS` environment variable is the correct way to address this; that troubleshooting guide is outdated (though it probably still works to write out the config file like that).
As far as I understand, this behavior has not changed in ProGet 2025 and has been the default behavior since ProGet 2022. We had considered changing the Linux ports to be 8624/8625 to mirror Windows defaults, but decided to just update the documentation to clarify.
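For example, with a containerized install, the variable can be set at `docker run` time. This is a sketch - the image name and tag here are illustrative, so check the installation docs for the current ones:

```shell
# Make the ProGet container listen on port 8624 instead of its default;
# ASPNETCORE_URLS is the standard ASP.NET Core listener setting.
docker run -d --name proget \
  -e ASPNETCORE_URLS=http://+:8624 \
  -p 8624:8624 \
  proget.inedo.com/productimages/inedo/proget:25.0
```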
Thanks,
Steve
Hi @jw ,
You can ignore the message. There is no relation to a package; someone just happens to have that URL open in the browser, and the page is using AJAX to request updates to display the "alert bar" at the top. Eventually they'll close it and the error will go away.
Thanks,
Steve
Hi @jw,
That URL is for updating the notification bar -- typically you'll see this when someone has a browser window open with an old version of ProGet that is attempting to load notifications from a URL that no longer exists.
Thanks,
Steve
@kc_2466 thanks for the suggestions, we'll also get this in a future maintenance release via PG-3067 ... should be trivial!
Hi @kc_2466 ,
Thanks! We'll add this via PG-3066 in an upcoming maintenance release, hopefully later this week if it's as trivial as it seems :)
Thanks,
Steve
Hi @parthu-reddy ,
There's nothing in ProGet that would cause a file name ending with `.spd` to fail. Just to double-check that it's not related to the file name, I would rename it and see if it works.
The main thing that's jumping out to me is that there are large files (3.1GB). Those are uploaded using a chunked process, orchestrated by the browser. I would keep "playing" with this and see if you can identify any patterns.
The message from your firewall team is saying that they are seeing `RST` messages from the client and server. That's unusual, and means the connection is being forcibly closed. ProGet does not operate at the network level and there is nothing that can cause ProGet to issue an `RST` packet, which means it's almost certainly network-related.
I think you're on the right track with investigating the network side -- but unfortunately you're going to have to find out what is issuing the `RST`. The firewall might not be, but maybe there are other network devices that are?
Maybe something is "taking too long" and getting an `RST`.
Thanks,
Steve
Hi @alex_6102 ,
Can you help us understand why one would want to use custom signing keys?
When a Debian repository (i.e. ProGet) is configured over HTTPS, signing the `InRelease` index is totally redundant and pointless. Moreover, from a security standpoint, these signing keys are cryptographically inferior to HTTPS.
The only reason to use signed indexes is so that `apt` doesn't issue a warning. They serve no other purpose in an HTTPS ecosystem.
On our end, it's a nontrivial effort so it seems like a waste of limited resources.
Thanks,
Steve
If you just download a new version of Inedo Hub from https://my.inedo.com/ this will work - for whatever reason, the self-updating function of Inedo Hub must not have worked for you.
Thanks,
Steve
Hi @koksime-yap_5909 ,
This is a known issue and will be fixed via PG-3063 in the next maintenance release, shipping later today.
Thanks,
Steve
Hi @koksime-yap_5909 ,
That's just a "suggestion box" in the UI - you can type in whatever you'd like. But in general, you shouldn't upload files via the UI; use `maven` instead.
Thanks,
Steve
Please see "What is a 'Connector' in ProGet?" to learn more about how to accomplish this in ProGet.
Since Artifactory and Nexus are basically the same thing, here is some more context on comparing the "mental model" between those products and ProGet
https://blog.inedo.com/proget-migration/how-to-manage-repositories-in-proget-for-artifactory-users/
Thanks,
Steve
No; when you use LDAP, you need to manage the LDAP users' groups in your directory. Here is a recent discussion on the topic:
https://forums.inedo.com/topic/5471/feature-suggestion-allow-ad-users-to-be-added-to-built-in-groups
Thanks,
Steve
Hi @patrick-tessier_4584 FYI,
`dotnet nuget` and `nuget.exe` are effectively the same thing - they both invoke the NuGet CLI. `nuget.exe` is a standalone version of the CLI, and `dotnet nuget` is part of the .NET SDK.
The issue is that your copy of `nuget.exe` was ancient and didn't work.
Cheers,
Steve
@patrick-tessier_4584 thanks for letting us know!
NuGet 2.8.3 was released October 17, 2014, which I believe predates ProGet. It probably uploads things in a quirky way that we haven't tested in many years.
In any case, it sounds like it's working now and the issue was the ancient version of `nuget.exe`.
@jfullmer_7346 this is a bit tricky behind the scenes but hopefully will be resolved with PG-3047 -- which we hope to get in the next or following maintenance release
The underlying error is network-related; perhaps the traffic is being interfered with, the client is prematurely disconnecting, etc. There are no settings in ProGet nor issues with your ProGet configuration.
Unfortunately that's a bit challenging to troubleshoot, but I suggest you start with a tool like Fiddler Classic or Proxyman, and see what requests are being sent to ProGet.
Then try to reproduce using only curl (not the client tool). Then bypass network equipment / proxies / etc., running the request entirely on the ProGet server/container if you have to.
Hope that points in the right direction; please share what you find.
Thanks,
Steve
Hi @kc_2466 ,
Thanks for the report; we'll get this fixed in the next maintenance release (this week) via PG-3064
Thanks,
Steve
That makes perfect sense; when package caching is enabled, the package files are added to the feed and stored on disk. When it's not enabled, they are simply streamed from the server.
Something is interfering with your disk storage and deleting (quarantining) files on disk. You will need to follow the troubleshooting steps I outlined above - it's probably a security tool.
I suggest using AWS Lightsail to get a test box in an environment you can control.
Thanks,
Steve
Hi Marc,
I'm not sure what you mean by `2024.8-r1`, but the `-ci.x` and `-rc.x` images are prerelease versions. Sometimes they are the same as the release versions, sometimes they aren't.
I wouldn't use prerelease versions unless we suggest them to you for a specific issue you're facing.
-- Dean
Hi @v-makkenze_6348 ,
You can use a ProGet Trial Key or if this is just a quick temporary instance, just use your existing license key to try it out.
Thanks,
Steve
Thanks @imm0rtalsupp0rt, this is exactly what I was looking for -- I'd like to be clear what's wrong though.
You're saying the `$.data[].versions[]` array only contains the latest version, and not all the versions?
Hi @carsten_1879 , just to give another update.
The issue appears to be localized to your ADO Server, perhaps it's the version or locale. The authentication header doesn't seem to make any difference.
Whatever the case is, the library we use, `libgit2`, simply cannot clone from it -- for whatever reason your ADO Server is returning a `400` with a German error message when doing what should be a totally fine request. We did not test with `git.exe`.
Our new, proprietary library also does not work, but for a different reason - it looks like ADO Server is using an ancient format (from 2010, perhaps?) for one of the repos, which we don't support yet. At least we think that's the case.
We'll need to spend a bit more time researching this, so it'll get pushed into the following weeks.
Thanks,
Steve
@steviecoaster @imm0rtalsupp0rt in the screenshot above it's not clear what API is being used, what queries are being made, and what's expected -- if you could provide us with that, we could troubleshoot this very quickly
@stefan-hakansson_8938 as you noticed, ProGet's Debian connectors are not currently designed to handle gigantic operating-system mirrors very well. This is because they are always refreshed "on demand" - which is what you want for CI/CD workflows.
It's not great for public repository mirroring, however. In Q4 or later, we will explore adding an option to do periodic updates.
@v-makkenze_6348 thanks, unfortunately I wasn't able to reproduce it
Can you try a manual import by going to Reporting & SCA > Projects & Builds > Import SBOM > paste in your SBOM?
If that gives an error, can you also try to edit the application name/version in the XML before pasting? For example, change it to:
<component type="application">
  <name>MyTestProject</name>
  <version>25.3.85.1</version>
</component>
That will create it as a new project/build.
Just trying to figure out where the issue might be.
Thanks,
Steve
@carsten_1879 thanks for the update
We were able to reproduce this using your credentials, and it's definitely an "error reporting an error" on Linux. On Windows it's clearer ("too many redirects or authentication replays").
We do not test on ADO Server, but quite a few customers use it - who knows why it works for them but not you. Anyway, please bear with us as we figure it out.
Hi @imm0rtalsupp0rt ,
We don't know what `choco search` is doing behind the scenes with regards to API calls, but I think it's using the V2 OData API. That gets pretty complex, and we don't have enough information to work with yet - and trying to set up a reproduction case is quite the endeavor.
Could you provide a very basic reproduction, perhaps with a dummy package or two, using only the API calls? If you use something like Fiddler, you can see the underlying API calls.
Thanks,
Steve
@erich_1530 in the ProGet software, you will see a "License Required" error message. If you click on "Request License Key" you can get a Free or Trial key from within ProGet. Or you can go to my.inedo.com and request a key from there if your server does not have internet access.
@carsten_1879 excellent, thanks! Please give us a little time to review this.
We will also test it with the new Git library, which is available in the `buildmaster:24.0.8-ci.9` container image -- you have to go to Admin > Advanced Settings and enable it.
If you get a chance to try it yourself, please let us know. We are currently testing it ourselves on our main server.
Hi @james-woods_8996 ,
No problem using ProGet 2025 Docker Image with SQL Server - it works exactly as it did in ProGet 2024.
We will continue to support SQL Server through at least ProGet 2026.
Thanks,
Steve
@phopkins_6694 perfect, just what I was looking for
I was able to reproduce this - searches via the NuGet API (which connectors use) will consider tags, while local searches seem to only be looking at names.
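A toy model of the mismatch, purely illustrative and not ProGet's actual search code: the remote-style search matches tags, while the local-style search only looks at names:

```python
def search(packages, query, include_tags):
    """Toy search: match the query against package ids, and optionally tags."""
    q = query.lower()
    hits = []
    for p in packages:
        haystack = [p["id"].lower()]
        if include_tags:
            haystack += [t.lower() for t in p.get("tags", [])]
        if any(q in h for h in haystack):
            hits.append(p["id"])
    return hits

packages = [
    {"id": "Acme.Logging", "tags": ["logging", "diagnostics"]},
    {"id": "Acme.Utils", "tags": ["helpers"]},
]

# Searching a tag word finds the package remotely but not locally.
remote_style = search(packages, "diagnostics", include_tags=True)
local_style = search(packages, "diagnostics", include_tags=False)
```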
I'm not sure how long this has been the case, but we'll try to get it fixed soon. Unfortunately it's not a trivial fix, hopefully we'll do it via PG-3046, likely in the July 18 maintenance release.
Thanks,
Steve