Hello; I've updated the documentation to clarify this, but it's available starting in ProGet 5.2.9. So, you'll need to upgrade to enable it :)

atripp
@atripp
C# developer by trade, but writing less and less code and more and more specs.
Best posts made by atripp
-
RE: Service Health API call returning 404
-
RE: NPM Connector returns plus "+" in versions
Thanks for the update! I've noted this in the docs, and linked to this discussion :)
https://github.com/Inedo/inedo-docs/commit/d24087911584bbda833314084a58c2ae1ff41c39
-
RE: [ProGet] [NativeApi] NpmPackages_DeletePackage not working.
Hello,
That API will only delete package metadata from the database, not from disk. It's mostly intended for internal use, and probably shouldn't be exposed in the API. In any case, we don't store the `@` internally, so if you change `@myscope` to `myscope` it should work.
Note that the npm API doesn't provide a way to delete packages, and we never implemented one. There hasn't been any demand for it to date, as people don't really delete packages programmatically - but you're definitely welcome to submit a feature request and help us understand why it'd be of value (like, the workflow you use that requires deleting packages, etc.).
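If you do end up calling that Native API method, here's a rough sketch of what it might look like from PowerShell. Note that the parameter names below (`Feed_Id`, `Package_Name`, `Package_Version`) are assumptions on my part; check the Native API reference page in your own instance before relying on them:
```powershell
# Hypothetical sketch: invoke the NpmPackages_DeletePackage Native API method.
# Parameter names are assumptions; verify against the Native API reference in
# your ProGet instance. Remember: this only removes metadata, not files on disk.
$proGetUrl = "https://proget.example.com"   # placeholder: your ProGet server URL
$apiKey    = "my-native-api-key"            # must be a key with Native API access

Invoke-RestMethod -Method Post `
    -Uri "$proGetUrl/api/json/NpmPackages_DeletePackage" `
    -Headers @{ "X-ApiKey" = $apiKey } `
    -Body @{
        Feed_Id         = 1                    # assumption: the feed's numeric ID
        Package_Name    = "myscope/mypackage"  # note: no leading @ on the scope
        Package_Version = "1.0.0"
    }
```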
Alana
-
RE: Creating PowerShell repository, protecting pull/download by API key
Hello, for sure!
It's pretty easy; just don't give the `Anonymous` user any access to your feeds, and then authentication will always be required, both when browsing the ProGet application and when using the API (such as `Install-Module`).
When you use the `Register-PSRepository` command, you can use the `Credential` option to specify a credential. This credential can be the username/password of a user inside of ProGet (let's say, `Admin:Admin`), or it can be a username of `api` with a password of an API key you've configured (so, `api:my-secret-key`).
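To make that concrete, here's a rough sketch; the feed name and URLs are just placeholders, so substitute your own feed's endpoint:
```powershell
# Build a credential: either a ProGet username/password, or "api" plus an API key.
$password   = ConvertTo-SecureString "my-secret-key" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ("api", $password)

# Register the ProGet feed as a PowerShell repository (feed name/URLs are placeholders).
Register-PSRepository -Name "InternalPowerShell" `
    -SourceLocation "https://proget.example.com/nuget/internal-powershell/" `
    -PublishLocation "https://proget.example.com/nuget/internal-powershell/" `
    -InstallationPolicy Trusted `
    -Credential $credential

# Installing a module then uses the same credential.
Install-Module -Name "MyInternalModule" -Repository "InternalPowerShell" -Credential $credential
```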
-
RE: Restricting API access to View/Download
Hello;
The Native API is for low-level, system functions, and it's "all or nothing". If you give someone access to the Native API, you are effectively making them an administrator, as they can also change permissions and grant admin privileges. So, I don't think you want this. Instead, you'll want to use the Debian API endpoint that we implement.
It's a third-party API format. In order to support third-party package formats like NuGet, npm, etc., ProGet implements a variety of third-party APIs. We only provide minimal documentation for these APIs, as they are generally already documented elsewhere. However, you can usually find the basics by searching for the specific things you'd like to do with the API, such as "how to search for packages using the NuGet API" or "how to publish an npm package using the API".
So in this case, I recommend searching for "how to view and download apt packages".
-
RE: PyPI package not shown in search results accessible via url
I'm not very familiar with PyPI packages, but I know there are some oddities with `-` and `_`, and that they are sometimes supposed to be treated the same, and sometimes not. We don't totally understand all the rules, to be honest (even after reading the PEP 503 specification). In this case, the package is actually `websocket_client`, not `websocket-client`.
See: https://pypi.org/project/websocket_client/
When you search for `websocket_client` in ProGet, it shows up as expected.
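For what it's worth, PEP 503 says clients should normalize names by lowercasing and collapsing runs of `-`, `_`, and `.` into a single `-`; whether a given repository or client actually applies this consistently is another matter. A quick sketch of that rule:
```powershell
# PEP 503 name normalization: lowercase, and collapse runs of '-', '_', '.' into '-'.
function Get-NormalizedPyPIName([string] $Name) {
    return ($Name -replace '[-_.]+', '-').ToLowerInvariant()
}

Get-NormalizedPyPIName "websocket_client"   # -> websocket-client
Get-NormalizedPyPIName "websocket-client"   # -> websocket-client
```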
-
RE: How to find out package disk space?
In ProGet 5.3, we plan to have a couple of tabs on each Tag (i.e. container image) that would provide this info: Metadata (a key/value listing of a bunch of stuff) and Layers (details about each of the image's layers).
That might help, but otherwise, we have retention policies which are designed to clean up old and unused images. We'll also have a way to detect which images are actually being used :)
-
RE: [BUG - ProGet] Not able to remove container description
As @apxltd mentioned, we've got a whole bunch planned for ProGet 5.3.
I've logged this to our internal project document, and if it's easy to implement in ProGet 5.2 (I can't imagine it wouldn't be), we'll log it as a bug and ship it in a maintenance release.
Do note, this is not an IMAGE description, it's a REPOSITORY description (i.e. a description of a collection of images with the same name, like `MyCoolContainerApp`); this means the description will be there on all images/tags in the repository.
-
RE: [Question - ProGet] Are versions amount wrong ?
You're right, I guess that's showing the "layers" instead of the "tags"; I think it should be showing container registries separately (they're not really feeds), but that's how it's represented behind the scenes now.
Anyway, we're working on ProGet 5.3 now; there are a whole bunch of container improvements coming, so I've noted this in our internal project document to make sure we get a better display for container registries.
-
RE: Anonymous user can see list of packages and containers
@Stephen-Schaff thanks for the bug report! I verified that this can happen depending on the user's permissions and which feeds they can/can't use --- but it seems like an easy enough fix that we can do via PG-1894 (targeted to the next release). The packages can't actually be viewed upon clicking, but it's a sub-optimal experience to show packages they can't see.
Latest posts made by atripp
-
RE: Pulling dependencies from ProGet in gradle
I'm not really sure, but I'll explain how things work so it might help troubleshoot.
First, the Maven API does not provide a file listing. While you (as a user) can often "see" one via an HTML page (like this listing at jboss.org), it's simply not available through the API. The only required file in an artifact is the `.pom` file, so when you "pull" an artifact to ProGet, that's all you'll get.
Next, the "remote" icons next to the files indicate that they were cached (i.e. added to the feed) via a connector. That means the files were successfully written to disk and recorded in the database... but now they are gone. Hence why you keep getting the "file not found" message.
The most likely culprit for this is something deleting the files from the package store. We often see security tools doing that for "safety" reasons, since they are `.jar` files that may be dangerous, I guess.
Overall, the Maven API is a very simple series of GETs. So perhaps you can just experiment with this? First, start with a brand new feed and a connector.
Then, run the command:
curl http://my.server.local:8624/maven2/my-new-feed/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar --output hamcrest-core-1.3.jar
You should see the file download via curl. Afterwards, you should see the artifact as a cached package (Top Navigation > Packages > Select Cached). You should also see that `.jar` file on disk, written to the package store location (Manage Feed > Storage).
If you don't see a .jar file downloaded to that location on disk, then it means something is "blocking" the file from being written. If it's there, then it means something is deleting it afterwards.
Thanks,
Alana -
RE: Rust - invalid gzip header
Hi @rel_0477 ,
Sounds like this is a pretty specific edge case. Can you provide a reproduction case so we can take a look?
Thanks,
Alana -
RE: Debian feed mirror Performance
Hi Dan,
In general, a ProGet feed will be slower than a "real" Debian repository. The reason is that Debian repositories are just a static file system, like this:
http://ftp.us.debian.org/debian/dists/bookworm/
There is obviously a lot more overhead with each ProGet request, since index files are dynamically generated, involve connectors, need to be permission-checked, etc.
In addition, ProGet builds its Debian indexes on demand, which means downloading all of the Contents-*.gz files and indexing those, like the ones in here: http://ftp.us.debian.org/debian/dists/bookworm/main/
While these files are cached, they do need to be updated when the remote repository updates.
And note that each web node maintains its own local index cache, so you will see these long-running requests multiple times. Some organizations will periodically "warm up" the indexes by just hitting the InRelease endpoint.
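For example, a small scheduled job along these lines could do the warm-up; the feed name and distributions in the URL are placeholders, so adjust them to your feed's actual endpoint URL:
```powershell
# Hit the InRelease endpoint on each web node to trigger/refresh the index cache.
# The feed name ("debian-mirror") and distributions below are placeholders.
$feedUrl = "https://proget.example.com/debian/debian-mirror"

foreach ($dist in @("bookworm", "bookworm-updates")) {
    Invoke-WebRequest -Uri "$feedUrl/dists/$dist/InRelease" -UseBasicParsing | Out-Null
}
```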
Hope that helps,
Alana -
RE: IIS/WIA deprecation and support
Hi @sgardj_2482 ,
ProGet's Integrated Web Server already supports Windows Integrated Authentication (WIA). Actually, it supports WIA better than IIS, in that only the Web UI and supported feeds (or the ones you configure) will do the WIA challenge.
Behind the scenes, the Integrated Web Server (IWS) uses Kestrel unless you're doing so-called "port sharing", which means having two web applications share the same port. That's becoming less and less common, and isn't something Microsoft recommends anymore.
If you bind to a host name in IWS (i.e. port sharing), then the operating-system-level HTTP.SYS component is used. That is much less flexible with WIA, and every request must be authenticated, which means it'll never work with Docker, npm, etc. So it's not recommended.
Thanks,
Alana -
RE: HTTP 403 response
Hi @michal-roszak_0767 ,
401/403 responses are not logged, so you won't see a server-side event.
403 means authentication was successful, but the user doesn't have the required permissions.
My guess is that the wrong feed or the wrong credentials are being specified - like maybe an API key is being used?
Thanks,
Alana -
RE: Error using HTTP Request
I'm not sure, but maybe it's something simple like a typo. I don't see it though.
That `401` message will occur when credentials weren't sent.
Behind the scenes, `pgutil` uses that API and authenticates by adding the same header:
https://github.com/Inedo/pgutil/blob/thousand/Inedo.ProGet/ProGetClient.cs#L37
Perhaps you can use a proxy like Fiddler or Proxyman to see the difference in HTTP traffic?
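As a quick sanity check outside of the operation, you could also re-send the same request by hand and confirm the header actually reaches ProGet. This is just a sketch: the URL is a placeholder for whatever endpoint you're calling, and `X-ApiKey` is assumed to be the API-key header you're adding:
```powershell
# Re-send the same request manually to confirm the API key header reaches ProGet.
# $uri is a placeholder for the endpoint your HTTP Request operation calls;
# X-ApiKey is assumed to be the header you are adding.
$uri    = "https://proget.example.com/api/..."   # placeholder: your actual endpoint
$apiKey = "my-api-key"

Invoke-RestMethod -Method Get -Uri $uri -Headers @{ "X-ApiKey" = $apiKey }
```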
Thanks,
Alana -
RE: Unexpected URL for feed after creating with /api/management/feeds/create and endpointURL
It sounds like you want to enable API v3, which uses the `v3/index.json` URL suffix?
In that case, make sure the `useApiV3` property is set. Also, I don't think you can set all of those properties on create... you may have to create the feed, and then update it.
Also note that you cannot set the `endpointUrl` property; it's read-only. It's generated based on the incoming request, so if you're viewing it on `localhost` you'll see that, if you view it on `myserver.corp` you'll see that, etc.
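A rough sketch of that two-step approach is below; the `update` endpoint path and the exact shape/placement of the `useApiV3` property are assumptions on my part, so double-check them against the Feed Management API docs for your ProGet version:
```powershell
# Two-step approach: create the feed first, then set useApiV3 via an update call.
# The update endpoint path and property shape are assumptions; verify against
# the Feed Management API documentation.
$proGetUrl = "https://proget.example.com"   # placeholder: your ProGet server URL
$apiKey    = "my-api-key"
$headers   = @{ "X-ApiKey" = $apiKey }

# 1. Create the feed.
Invoke-RestMethod -Method Post -Uri "$proGetUrl/api/management/feeds/create" `
    -Headers $headers -ContentType "application/json" `
    -Body (@{ name = "private-nuget"; feedType = "nuget" } | ConvertTo-Json)

# 2. Update it to enable the v3 API.
Invoke-RestMethod -Method Post -Uri "$proGetUrl/api/management/feeds/update/private-nuget" `
    -Headers $headers -ContentType "application/json" `
    -Body (@{ useApiV3 = $true } | ConvertTo-Json)
```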
Hope that helps,
Alana -
RE: Unable to GET from connector "nuget.org"; using cached copy.
Hi @parthu-reddy ,
This looks more related to server overload / database load than anything else to me. Note how all the requests are coming in at the exact same second. Can you throttle your load balancer a bit, so they don't all hit at the same time? Even a slight delay will help.
FYI - we are tracking a recent regression in the SQL Server analysis engine (???) that is causing one particular query (NuGet_GetPackage) to go incredibly slowly under extreme traffic. For some reason, it's suddenly using the wrong plan. It's been happening to a few users after a recent upgrade/patch to SQL Server. We have a workaround, but would like to test it in the field with a user.
Thanks,
Alana -
RE: Using LDAP on Buildmaster located in a container (Linux)
Hi @marc-ledent_9164 ,
This is available in InedoCore-3.0.4, so if you go to Admin > Extensions, you should be able to update.
thanks,
Alana -
RE: Error when attempting to connect BuildMaster to Bitbucket Cloud
Hi @mhelp_5176 ,
I haven't investigated BitBucket Cloud any further, but it sounds like there's an issue with the integration -- and it's definitely something we can look at later. My guess is that it's some kind of change to the API/authentication. But we're all pretty focused on getting ProGet 2025 out the door, so it'll have to be after that.
That said, the main difference between connecting to a "Git host" like GitHub, GitLab, Gitea, BitBucket, etc. vs a "generic Git repository" is that there will be some intelligent drop downs to help you select a repository. There are a few other differences as well, but mostly it's UI.
So for example, on a GitHub connection, you'll see a list of organizations and repositories, and then would select the one to connect to. Compare this to the "Generic Git repository", where you simply paste in the repository clone url.
But in either case, you need to configure each repository connection individually; typically each application will have one repository, which is why this is part of the application creation process.
It's definitely not common practice to "pull in all the repositories in a workspace" at once - that's not really how Git works, and it would involve some kind of script that iterates over a list of repositories and clones/updates each one individually based on configured remotes in subfolders.
Cheers,
Alana