Ok, thanks for the information.
Posts made by jw
-
RE: Variable with package url for webhook
For my specific use case, a URL to the package would already be sufficient. Could that be made available more easily? The code that executes these hooks must already have some context about the package (Id, Version, etc.)?
-
RE: ProGet NuGet upload user tracking
I changed the log level to "Everything" but nothing shows up in the logs at all ("There are no API request logs."), which is pretty odd...
I'm also fairly certain that the key itself is working properly, because when I comment it out in my nuget.config, nuget.exe shows a login prompt when I try to push.
-
RE: ProGet NuGet upload user tracking
My permissions look like this:
I'm pushing with the API key of the greyed-out user, who has admin rights.
I also tried removing the Anonymous rights here entirely, but as expected there was no change, since these are just viewing rights.
-
RE: HTTPS with self hosted ProGet and internal web server
I saw this has been implemented a while ago, so thank you for that.
Some feedback:
HTTPS Binding to a Hostname
Essentially worked like a charm.
Maybe a minor thing for the documentation that could help people in the future:
I opened the certificate and blindly copied the first thing that looked like a thumbprint hash (yeah, I know, reading is hard...). But on Windows that is the serial number, not the thumbprint, which of course won't work at all.
So a little hint in the documentation could be helpful, e.g.
certhash is the thumbprint of your SSL certificate
=>
certhash is the thumbprint of your SSL certificate, NOT the serial number
HTTPS Binding to a Port (Experimental)
Also worked nicely; here too just a minor documentation nitpick:
In the line
"For example, if you wanted to bind to port 4403 for NETWORK SERVICE, you'd run this command:"
the port is probably a typo? At least the next lines say something different.
Also, first it says
netsh http add urlacl url=https://*:443/ user="NETWORK SERVICE"
but then a few lines down it says
<WebServer Enabled="true" Urls="https://*:8625;http://*:8624" />
A couple of lines further down the second line shows up again, and even further down it appears once more with yet different ports.
For this configuration to work properly, these settings must match up.
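To illustrate what "matching up" means here, a minimal sketch (the port 8625 and the empty appid GUID are placeholders; substitute your actual values and thumbprint):

```shell
# Reserve the HTTPS URL for the service account -- the port here...
netsh http add urlacl url=https://*:8625/ user="NETWORK SERVICE"

# ...must match the port the certificate is bound to
# (certhash = the certificate THUMBPRINT, not the serial number):
netsh http add sslcert ipport=0.0.0.0:8625 certhash=<thumbprint> appid={00000000-0000-0000-0000-000000000000}

# ...and also the HTTPS port in the WebServer element of ProGet.config:
# <WebServer Enabled="true" Urls="https://*:8625;http://*:8624" />
```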
-
ProGet symbol proxy
Hi,
we're currently looking to migrate away from Nexus, and one feature that is missing is a proxy server for symbols.
https://github.com/sonatype-nexus-community/nexus-repository-microsoft-symbol-server
The proxy should essentially behave like a NuGet caching proxy for nuget.org, but for the Microsoft and NuGet symbol servers instead.
Can this be done somehow or are there plans to support this?
-
Variable with package url for webhook
Hi,
is there a variable that contains either the base URL of the web interface or the full URL of a package that can be used in webhooks?
I've created a message card with a clickable link, as shown below. The only thing I currently have to hardcode is the base URL of the ProGet web interface.
Is there a better way to do this?
{
  "@type": "MessageCard",
  "@context": "http://schema.org/extensions",
  "title": "$FeedType $PackageId $PackageVersion was $WebhookEvent",
  "summary": "summary",
  "sections": [{
    "facts": [
      { "name": "User:", "value": "$UserName" },
      { "name": "Link:", "value": "[$PackageId $PackageVersion](http://192.168.0.1:8624/feeds/$FeedName/$PackageId/$PackageVersion)" },
      { "name": "Feed:", "value": "$FeedName" }
    ],
    "markdown": true
  }]
}
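For anyone else wiring this up: the link in the card is just the webhook variables glued onto the hardcoded base URL. A small hypothetical helper (Python, not part of ProGet) that shows the composition:

```python
def package_url(base_url: str, feed: str, package_id: str, version: str) -> str:
    """Compose the ProGet web UI link for a package.

    base_url is the part that currently has to be hardcoded
    (e.g. "http://192.168.0.1:8624"); the remaining pieces correspond
    to the webhook variables $FeedName, $PackageId and $PackageVersion.
    """
    return f"{base_url.rstrip('/')}/feeds/{feed}/{package_id}/{version}"

# Example matching the message card above:
print(package_url("http://192.168.0.1:8624", "MyFeed", "My.Package", "1.2.3"))
# http://192.168.0.1:8624/feeds/MyFeed/My.Package/1.2.3
```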
-
ProGet NuGet upload user tracking
Hi,
two issues with user tracking for NuGet uploads:
- I can't find a way to see which user uploaded a package. Both the metadata page and the history page show the upload time, but not who uploaded it. Is there a way to show this?
- When uploading a NuGet package via API key, it seems like ProGet does not identify the user correctly. I've set up a Teams notification hook for testing, and the user is only shown when uploading via the website. When pushing the package via nuget.exe, the user is shown as "Anonymous". How can this user recognition be enabled for uploads via the API?
-
Email notification about new ProGet versions
Hi,
is there a way to be actively notified when a new ProGet version is released?
Some sort of newsletter one can subscribe to?
I know that the web interface will show something, but I'm looking for something less passive.
-
Remember me functionality for ProGet webinterface
Hi,
is there a way to activate a "Remember me" functionality in ProGet?
An example from GitLab:
The basic idea is for the session cookie to be valid for an (ideally configurable) amount of time, so one doesn't have to log in over and over again after closing and reopening the browser.
-
RE: Uploading snupkg using NuGet client
@lm FYI, our plan is to create a second URL for pushing symbol packages, and then essentially save the files pushed to that URL as .snupkg, next to the .nupkg files on disk. This URL will be wired to SymbolPackagePublish and documented.
Do you mean like a second URL for a single feed? Would the .snupkg then also show up in the feed's search results?
I think it would be good to at least have an option to hide them from the regular package list. Deleting them together would probably also make sense.
-
RE: Uploading snupkg using NuGet client
You're correct -- that would probably work, but we thought a "single feed approach" would be much better from a user-experience standpoint. This way, there's just one feed to configure, and you push package files and/or symbol files to that feed using that default NuGet configuration. The files are stored on disk, right next to each other.
True, though there are also a few downsides worth mentioning. Signatures no longer work, and the package itself always differs (size, file hash) from what a build server might produce, making potential issues in the pipeline harder to spot. Once the SymbolPackagePublish resource is implemented, there really is no downside for the user anymore; a single nuget push command will do.
Another pleasant side effect is that it would then be possible to manually push a .snupkg to a ProGet feed with nuget.exe, because even in this case the client checks for this resource on the server side before doing anything.
This is something we're planning for ProGet 2023, but can probably do as a preview feature in December or January.
That is good to hear, thanks.
-
RE: Uploading snupkg using NuGet client
In PG-2154 there is a comment "Update: this was not trivial and we will take a new approach."
Is there any update for this?
I did a quick and dirty test, and to me it seemed rather easy to get it to work. I created two ProGet feeds, then downloaded the index.json from the first feed, added the SymbolPackagePublish resource to the JSON, and pointed it to the second feed. That file was then copied to a webserver. Using its URL as a -Source for nuget.exe, or setting it as defaultPushSource in the nuget.config, immediately solved the problem.
If I'm not completely wrong, all that is needed is a configuration option on the first feed ("Use feed X as symbol feed"), and as soon as that is configured, the index.json of the first feed can return the correct SymbolPackagePublish resource and everything should just work.
-
RE: HTTPS with self hosted ProGet and internal web server
Sounds like a plan.
Looking forward to trying it out.
-
RE: HTTPS with self hosted ProGet and internal web server
Obviously it's pretty easy to just wire up values (and we'd be happy to do that), but my only hesitation is documentation and cross-platform compatibility.
Cross-platform should not be an issue, since this is one of the reasons Microsoft created Kestrel in the first place. Of course the Windows cert store part of the configuration does not make sense in a Linux context, but that should be pretty much the only limitation.
Brainstorming a bit but... from the UI, we will just support one URL I think. I don't know why you'd want multiple...
From what I understand, at least two URLs are needed to support HTTP and HTTPS side by side, but that could be abstracted away by the UI.
The docs you linked from Microsoft are confusing, and it's not really clear what fields to use or why. I guess, in general, you'd select a file (which InedoHub would generate?) or the cert store (that's Windows only, right?).
There are basically four sections that are of interest:
- "Http" => plain HTTP (similar to Urls, but just one host per endpoint)
- "HttpsInlineCertFile" => PFX file based (most likely used by Windows users who don't want to deal with the cert store)
- "HttpsInlineCertAndKeyFile" => .crt+.pem file based (most likely used by Linux users)
- "HttpsInlineCertStore" => Windows cert store integration, only works on Windows.
I think the other parts can be ignored.
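For reference, a sketch of what two of those sections look like in Microsoft's JSON configuration schema (hostnames and ports are placeholders):

```json
{
  "Kestrel": {
    "Endpoints": {
      "Http": {
        "Url": "http://*:8624"
      },
      "HttpsInlineCertStore": {
        "Url": "https://*:8625",
        "Certificate": {
          "Subject": "my.company.domain.local",
          "Store": "My",
          "Location": "LocalMachine",
          "AllowInvalid": false
        }
      }
    }
  }
}
```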
And I guess what, when you generate a cert yourself, it's a pfx or something? I always forget the differences....
PFX is just one possible certificate format; it is possible to generate self-signed certificates of all types. I believe the majority of business/enterprise users will use a cert they get from their IT department, issued by the company-wide PKI, and then they just need the different options to plug that into ProGet.
Letting the user who installs ProGet generate a certificate can be done, but it will show up as insecure (red, broken lock, etc.) in all browsers, and package managers will probably complain about it being untrusted as well. So I'm not 100% sure that implementing this is time well spent, since there are a million guides on the internet on how to generate a cert.
How would you suggest we modify that element to support the different options they have?
I would probably switch from the Urls parameter to a list of endpoints. This way the HTTP+HTTPS scenario can be supported, as well as the four options above.
The JSON from the example could simply be translated to XML 1:1:
<WebServer Enabled="true/false">
  <Kestrel>
    <Endpoint>
      <HttpsInlineCertStore>
        <Url>https://192.168.0.1:443</Url>
        <Certificate>
          <Subject>my.company.domain.local</Subject>
          <Store>My</Store>
          <Location>LocalMachine</Location>
          <AllowInvalid>false</AllowInvalid>
        </Certificate>
      </HttpsInlineCertStore>
    </Endpoint>
  </Kestrel>
</WebServer>
It should be possible, using Microsoft.Extensions.Configuration.Xml, to read the config file into an IConfiguration object and feed the relevant portion into Kestrel; this is pretty much what we currently do. This way we don't have to reinvent the wheel with HTTPS configuration and can offer all the flexibility that Kestrel offers right out of the box.
new HostBuilder()
    .ConfigureWebHostDefaults(webBuilder => webBuilder
        .UseConfiguration(configuration)
        .UseKestrel((context, serverOptions) =>
        {
            serverOptions.Configure(context.Configuration.GetSection("Kestrel"));
        }))
-
RE: HTTPS with self hosted ProGet and internal web server
We are also interested in this support, since we really don't want to go down the IIS route.
Yes. Not exactly sure how yet, but we would like the integrated webserver to support this as easily as possible.
From what I can tell, ProGet seems to be an ASP.NET Core 6.0 application, and the underlying (internal) webserver is probably Kestrel. We faced a similar requirement with our own product running on the same stack.
A first easy way to support this would be to expose more of the relevant Kestrel config: https://learn.microsoft.com/en-us/aspnet/core/fundamentals/servers/kestrel/endpoints?view=aspnetcore-6.0#replace-the-default-certificate-from-configuration-1
We are particularly interested in the HttpsInlineCertStore option, to be able to use certificates directly from the Windows cert store.
So maybe extending the ProGet.config with the relevant parts could be a good first step, before more fancy features like UI integration or self-signed cert generation are implemented.
<InedoAppConfig>
  <WebServer Enabled="true" Urls="http://*:8624/">
    <endpoint config goes here />
  </WebServer>
</InedoAppConfig>