
Posts made by atripp
-
RE: Applying an alternate tag hides the history of a pre-release tag
Thanks for the update @Stephen-Schaff -- and just to add to this, you should see "Note that you can use the following alternative tags to refer to this image:..." on the browse image page.
-
RE: ProGet shows "(500) Server Error: Value cannot be null. (Parameter 'version')" when opening "Dependencies" tab of Maven artifact
@jndornbach_8182 said in ProGet shows "(500) Server Error: Value cannot be null. (Parameter 'version')" when opening "Dependencies" tab of Maven artifact:
Is there a way that ProGet can cope with "parent-managed" dependency versions in future or, at least, does not throw this error?
For sure! It does look like this is supported in most places, but not on this page in the UI.
Easy fix... I logged this as PG-1968, and it will ship in the next maintenance release of ProGet, which is scheduled for next Friday. Or I can show you how to download a pre-release if you'd prefer it sooner :)
-
RE: ProGet says "500 Internal Server Error" when deploying via maven
Hi @jndornbach_8182,
Thanks for posting all the information, it's really helpful!
With that stack trace information, we can see where the error is occurring... and as the error says, the PUT body doesn't contain XML as the code expects.
```
if (context.Request.HttpMethod == "PUT")
{
    if (info.IsMetadataRequest)
    {
        if (info.HashAlgorithm == null)
        {
            var xdoc = XDocument.Load(context.Request.InputStream);
            var metadataElement = xdoc.Element("metadata");
            if (metadataElement != null)
            {
                var versioningElement = metadataElement.Element("versioning");
                if (versioningElement != null)
                {
                    var releaseVersion = (string)versioningElement.Element("release");
                    if (!string.IsNullOrWhiteSpace(releaseVersion))
                        await new DB.Context(false).MavenArtifacts_SetReleaseVersionAsync(feed.FeedId, info.GroupId, info.ArtifactId, releaseVersion.Trim());
                }
            }
        }
        else
        {
            // Don't need to actually save the hash since it's computed on demand
        }
        context.Response.StatusCode = 201;
    }
}
```
If I'm being totally honest, I don't understand why the code is doing what it's doing.
But that doesn't mean I can't help fix it! Could I trouble you to attach a tool like Fiddler to capture the requests, and then send us the session file? We can then inspect that PUT request, and see what's going on.
You can send it to support at inedo dot com with a subject of [QA-597] -- but please let me know when you do, so I can dig in the box and find it.
Thanks,
Alana -
RE: Is my understanding of ProGet incorrect?
Hi @Ashley_8010
In nearly all of our external retail websites, there are tonnes of connections to third party websites which inevitably drive up the loading times for our customers. Some loading times can be upwards of 10+ seconds (depending on huge sneaker drops or major releases)
It sounds like what you might need is a web bundler/packager (perhaps something like webpack)?
And then a Content Delivery Network on top of that, which can cache and serve static content faster than your server? We use Cloudflare ourselves.
ProGet does allow you to access individual files within a package (see the files tab on InedoLib for example), and download those files with a URL like this.
However, this feature was not designed to be a front-end web server, and I wouldn't recommend using it as such.
Cheers,
Alana -
RE: NuGet package with both dll and symbols???
Hi @joel_6345 ,
Can I trouble you to read the latest version of the documentation?
https://docs.inedo.com/docs/proget-feeds-nuget-symbol-and-source-server
Actually, I just rewrote it, so I'm really hoping this clarifies your question... and that it would have made things easier in the first place :)
Thank you!
-
RE: API Method to Get a List of Helm Chart Versions
Hi @Stephen-Schaff ,
I'm not sure about the API, but ProGet implements the Chart Repository API, and from a look at that, you should be able to just access index.yaml at the API endpoint URL.
Here is some information about the format of that file: https://helm.sh/docs/topics/chart_repository/#the-index-file
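To sketch that out (the host and feed name here are placeholders, and I'm assuming the usual /helm/«feed-name» endpoint URL form):

```
# Hypothetical feed URL -- substitute your ProGet host and Helm feed name
curl https://proget.example.com/helm/MyCharts/index.yaml

# The response groups versions by chart name, e.g.:
#   entries:
#     my-chart:
#       - version: 1.2.0
#       - version: 1.1.0
```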
Can you give that a try and let us know what you find out?
Cheers,
Alana -
RE: Permissions fine-tuning for NuGet feeds
Hi @coskun_0070 ,
Did you try setting an API key with setApiKey? Perhaps there's another way to suppress these messages?
While it's relatively easy to add privileges and features, we've learned the hard way that doing so creates a lot more work in the long run, both from a support standpoint and in user confusion. It's best to keep things simple.
I think this is addressable via NuGet client configuration.
Cheers,
Alana -
RE: Permissions fine-tuning for NuGet feeds
Hi @coskun_0070 ,
If I understand correctly, the issue is that you're having a hard time getting dotnet nuget push to work without granting anonymous access to view feeds?
In this case, I believe you need to add the URL as an authenticated package source. This will also let you download packages with dotnet nuget restore.
I believe this issue is resolved by dotnet nuget add source: https://docs.microsoft.com/en-us/dotnet/core/tools/dotnet-nuget-add-source
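For example, something like this should register the feed as an authenticated source (the URL and feed name are placeholders; with ProGet you can use api as the username and an API key as the password):

```
# Hypothetical feed URL and credentials -- substitute your own
dotnet nuget add source https://proget.example.com/nuget/MyFeed/v3/index.json \
  --name MyFeed --username api --password «your-api-key» \
  --store-password-in-clear-text
```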
Cheers,
Alana -
RE: Cannot upload deb file to feed
Hello, just confirming we received it! We'll begin investigating the problem from here.
-
RE: Getting "unexpected HTTP status 503 Service unavailable"message with proget
@joshy-mathew_7277 what type of package is this?
HTTP is not a great protocol for pushing really large files like that, and things like IIS and middleware break down with huge requests. ProGet doesn't impose any limits per se, but the 503 error means that something between ProGet and your browser is killing the request. It's usually IIS.
Most of the client tools (NuGet, etc.) do not support "chunked" file uploads (Docker does), which is why we recommend using a drop folder for these large files.
-
RE: Looking for advice on Best Practice
Is there an accepted Code of Practice for managing prerelease stuff?
Yes, the general rules to follow are these:
- Packages are immutable; do not delete/republish packages as part of your normal workflow
- Use Prerelease Packages & Repackaging to take tested/validated prerelease packages to stable packages
- Use Package Promotion to move packages across feeds (see the sketch after this list)
- Use Retention Rules to automatically clean up unused prerelease packages
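As a rough sketch of what promotion looks like over HTTP, ProGet has a promotion API endpoint; the host, API key, feed names, and package details below are all hypothetical, so check the HTTP API documentation for the exact parameters:

```
# Hypothetical values -- promote a validated package from a CI feed to a stable feed
curl -X POST "https://proget.example.com/api/promotions/promote" \
  -H "X-ApiKey: «your-api-key»" \
  --data "packageName=MyApp.Core&version=1.4.0&fromFeed=ci-packages&toFeed=stable-packages"
```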
Do people publish prerelease stuff to a different feed and only post the released stuff to the more public feed? Or is it just expected that once the release is out that the prerelease builds are simply removed from the feed?
Yes to both
On our ProGet Instance, we use both patterns.
- Extensions (plugins) have two feeds: Extensions (Stable) and PrereleaseExtensions; the reason is that we use CI (build automation), and publish a new package on every commit, and these could be really unstable - we don't want anyone using a prerelease extension unless we explicitly point them to it
- Our NuGet libraries have one feed (NuGetLibraries) that has both prerelease and release packages; these are not intended for anyone other than Inedo engineers, and we have policies, practices, and training in place to make sure prerelease packages are shipped appropriately
My advice for deciding which pattern to follow would be to look at the consumers of your feeds/packages (i.e. who uses your packages vs. who publishes them).
Using a single feed with both release and prerelease packages requires more training for developers. If one of your developers accidentally uses a prerelease package and commits that, it's going to cause problems. Even if you catch it before production, it will waste time and resources.
-
RE: Connection issues when configuring LDAP on Linux container
@kichikawa_2913 we've reviewed this a bit more as a team, and believe that there are a few things to consider here.
First, it's clear you have a large, "older" Active Directory. There is a tremendous amount of customization one can do to Active Directory, and after enough of it over the years, you end up with an "older" directory that has layer upon layer of compatibility shims. You should see the crazy hacks they had to implement to get MSA accounts working...
It's also important to note that Microsoft Active Directory and .NET (Core) do not play nicely together. It took Microsoft over 10 years to get .NET Framework to work with Active Directory, and it's still really quirky. We've worked around as many of the bugs as we can.
Microsoft is still trying to get .NET Core on Linux to work properly with Active Directory, but it has a very long way to go, as you're seeing. There are so many strange behaviors we've already had to work around (like methods sometimes returning strings, sometimes returning byte arrays) -- and new ones will just come with new versions of their library.
For all we know, the crazy "2 or so minutes" to do a login query could be a parsing error in their library? Or something timing out in their network code, but not logging an error? We saw all that in .NET Framework. In any case, we can only guess because their library provides no diagnostic information for us to use.
At this point, you should open a support ticket with Microsoft. This is the only way we can see to identify why you have a "2 minute or so" delay running a basic login query.
The code we have is really, really simple. It follows all of Microsoft's guidelines, and it'd be super simple for you to reproduce the exact problem to show them. They have some advanced monitoring tools that can detect exactly what crazy stuff is happening between the query and Active Directory.
We can't do this, because we don't have access to your directory. It's unique to your setup and configuration, somehow.
Alternatively, just use Windows instead. It will be significantly cheaper in the long run (I suspect we've already burned through a lifetime's worth of licensing fees diagnosing this problem). Microsoft is still years away from even having the support infrastructure to help their customers with Linux problems, so any time there's a slight problem on Microsoft's end (SQL Server, .NET Core), it will be "DIY" -- which really means spending a lot of your time fixing quirks in their software.
-
RE: new docs: good! broken links: bad!
@mcascone thanks for the heads up! It was quite a big migration effort, but ultimately it will be a lot easier to maintain.
Looks like the UPack and Romp documentation got missed during the migration :( We will add it to the new site ASAP.
-
RE: Cannot upload deb file to feed
Hello; can you share this file with us?
Feel free to email it to support at inedo dot com, but please add [QA-586] to the subject so we can track this internally. -
RE: PGSCAN Utility Questions
@arozanski_1087 no problem!
And by the way, the pgscan tool is open source, so if you see opportunities to improve it, or want to develop something on your own, please don't hesitate to use the sources - https://github.com/Inedo/pgscan -
RE: ProGet installation issue without any logs
@coskun_0070 glad you were able to narrow this down some more!
There shouldn't be a problem using remote SQL Server instances with the Inedo Hub, and this is definitely a scenario we test and support. If you were to put in a bogus connection string, you should see an error during the "Validating Configuration..." step.
My new guess is that something is blocking the connection later down the line, which is causing this unexpected edge case? I don't know...
-
RE: OTTER 3 - Variable at environnement level not found
@philippe-camelio_3885 I had a chance to look at this a little more closely, but just can't reproduce the problem. First, I want to point out that no variable is defined on INTEGRATION:
"environments": [ { "name": "INTEGRATION", "variables": {} }, { "name": "PRODUCTION", "variables": {} }, { "name": "DEVELOPPEMENT", "variables": { "SQLInstances": "@(BOFRCT, ITRRCT, XTRRCT)" } } ],
Anyways, I imported your infrastructure (converted everything to localServer), deleted the variable from the role, and added it to INTEGRATION.
On VM008004 (in DEV), I got the expected error: "Could not resolve variable %SQLInstanceParams". If I delete the variable (SQLInstances) on the server, that variable is still resolved because it's defined in DEV, too.
On VM008007 (in INT), I got the (different) expected error: "Key StaticPorts not present in the map".
When I delete @SQLInstances (which is defined on the server), then I get the error "Could not resolve variable @SQLInstances" because it's not defined on the environment.
-
RE: ProGet installation issue without any logs
@coskun_0070
that's really strange.
Unfortunately, I don't know what we can do from here. I know it's frustrating to hear that, but you're currently the only one experiencing this, and it's only on your server...
Perhaps it's failing while accessing the local package registry? It's a guess...
-
RE: ProGet installation issue without any logs
@coskun_0070 unfortunately (or fortunately) we only have one user who's experiencing this (you), and we can't invest the one-on-one time to help further.
Can you try the ProcMon route? If you monitor events from the InedoHub process, you'll see files downloaded from proget.inedo.com to a temp folder and then extracted. If you can monitor what happens to those extracted files, it will give you a clue as to what's deleting them.
-
RE: ProGet installation issue without any logs
@coskun_0070 unfortunately there's no other code we can add.
It's very clear to us where the problem is occurring: the package files (zip files) are extracted to a folder, no error occurs during the extraction, and then a moment later the files are not there.
Can you use a tool like ProcMon to see everything that's happening behind the scenes? You may have something else interfering with the files after extraction:
https://docs.microsoft.com/en-us/sysinternals/downloads/procmon
-
RE: Otter 3 - Error This operation is only valid when run against an SSH agent.
FYI: we're going to investigate this some more, and will try to see why it's happening for you but not for me.
-
RE: PGSCAN Utility Questions
Hi @arozanski_1087,
Whatever consumer-package-name and consumer-package-source you set as parameters to pgscan is what the packages will be associated with.
Basically, the utility has the effect of doing this:
- Reading the sln file you specify
- Finding all NuGet packages
- Locating those packages in ProGet
- Adding a "package consumer" to each NuGet package with the "consumer" information (MyProject?)
To get the behavior you want, you may need to call pgscan multiple times with different "consumer" information, or even modify the tool / customize something that calls the API directly; see the sketch below.
Good point about the delete button; I added a change (PG-1957) where we'll get that as a UI addition.
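For reference, an invocation might look something like the sketch below; the flag spellings follow the parameter names discussed above, so check pgscan --help or the README for the exact syntax:

```
# Hypothetical invocation -- flag names and values are illustrative only
pgscan identify --input=MyProject.sln \
  --proget-url=https://proget.example.com --package-feed=Libraries \
  --consumer-package-name=MyProject --consumer-package-version=1.0.0 \
  --consumer-package-source=Apps
```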
-
RE: ProGet Free license violations detected
Hello,
This typically comes from having one ProGet Free instance connect to another ProGet instance.
You can clear the license violations here: https://docs.inedo.com/docs/proget/administration/license
Cheers,
Alana -
RE: Otter 3 - Error This operation is only valid when run against an SSH agent.
Hello,
I'm not able to reproduce this case at all.
The message "This operation is only valid when run against an SSH agent." is logged if you try to run SHExec against a non-Linux server, and I can consistently reproduce that. But when I switch to an SSH server, it works totally fine...
I wonder if there's something simpler at play, like the wrong $ServerName being in context or something? Nothing seems that way from the code you shared, but... can you try a very simple repro case, like an OtterScript that looks like:

```
for server myLinuxServer
{
    SHExec echo hello world;
}
```
Thanks,
Alana -
RE: PGSCAN Utility Questions
Hi @arozanski_1087 , happy to help!
Hopefully we can update the documentation with these improvements.
What actually belongs in the consumer-package-name and consumer-package-version fields?
This is supposed to represent your application. pgscan will detect the packages that your application uses, like Newtonsoft.JSON, and the version of each package.
How does pgscan handle subprojects and submodules when I call it from the .sln level?
When you point pgscan to a .sln file, it will parse the file and look for projects. Under each project, the tool will look for packages.config (the older-style project format) and then project.assets.json (the newer style).
How do I remove dependencies from packages once I register them?
This isn't currently supported, it seems (I don't see a delete button in the UI), but if you don't mind going into the database, you can just run DELETE [PackageDependents] and all the rows will be cleared. -
RE: [Proget] HTML Table on Package Dependencies Page has table layout issues
Hello;
Thanks for reporting this bug/layout issue! I just made a simple change (PG-1956) to fix it, and it will be available in the next maintenance release (5.3.29).
-
RE: ProGet installation issue without any logs
Hello;
Basically, this is failing very early in the installation process, during package extraction.
This would most likely be caused by only one of two things:
- disk is full; the packages are extracted to a temporary directory, so all drives should have at least 1GB just to be totally safe
- anti-virus is quarantining recently written files to disk
It might also be related to temporary file locking, so try rebooting to see if it helps.
Otherwise, check what could be preventing those package files from being extracted; it's typically anti-virus quarantine, so check those log files.
Please let us know what you find!
FYI: in https://forums.inedo.com/topic/3088/, the user said they disabled Windows Defender and it worked.
-
RE: Large Chocolatey package upload is failing
@joshuagilman_1054 that's a really large Chocolatey file (NuGet package), so you may want to rethink your approach; it'll cause some pain across the board as you try to download and install that file as well. Instead, perhaps have your Chocolatey package download an asset that you've stored in ProGet? See the sketch below.
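For instance, here's a minimal sketch of that approach in the package's tools\chocolateyInstall.ps1; the asset URL, names, and checksum are placeholders:

```
# Hypothetical asset URL -- the large binary lives in a ProGet asset directory,
# so the .nupkg itself stays tiny
$url = 'https://proget.example.com/endpoints/MyAssets/content/my-tool-installer.exe'

Install-ChocolateyPackage -PackageName 'my-tool' `
  -FileType 'exe' -SilentArgs '/S' -Url $url `
  -Checksum '«sha256-of-installer»' -ChecksumType 'sha256'
```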
In general, large files are tricky to publish over a single HTTP request reliably. This is across the board, even when uploading files to places like Amazon S3; those rely on a chunked uploading process... but the NuGet API doesn't support that.
Otherwise, there's no limit imposed by ProGet itself, and you've found the settings that ASP.NET imposes. There could be some other limitation happening, but it's hard to say where; apparently it varies by operating system version, and it might even be middleware (like a proxy/firewall).
The message "there must be exactly one package" is unexpected; I would instead expect 'request length exceed". In any case, that message just means that no valid files were attached to the request, which can happen if it was suddenly cut off.
All told, when it comes to really large files (even asset directories), a Drop Path approach may be easiest to use.
-
RE: Differences between sdk classes RemoteExecuteOperation, ExecuteOperation and RemoteJob
An ExecuteOperation is the simplest Operation class available. You just implement the ExecuteAsync method, and that code is run by the execution engine when the Operation is invoked in your OtterScript. This (and all operations) has an ExecuteCommandLineAsync helper method, which runs a command line on the server in context.
To interact with the server in context, you need to use the Agent property on the executionContext that's passed into the ExecuteAsync method. Because there are a lot of agent types and versions (Inedo Agent, Inedo Agent on Linux, PowerShell Agent, SSH Agent, Local Agent, etc.), you can use the TryGetService method to see if the agent supports what you want to do. Not all agents support all services. I think you've already seen how this works.
One of the agent services available is IRemoteJobExecuter. This essentially just performs a long-running task on the remote server, via the agent. For this service to be supported, the agent must support .NET; I think all agents do at this point (even SSH) thanks to .NET Core.
A RemoteJob is the class used to describe what this long-running task is. It contains information about what you want to do, has its own ExecuteAsync method that will run on the server, and can stream log messages back to BuildMaster/Otter. When defining a RemoteJob, you need to serialize/deserialize everything on your own.
For example, if your job simply wanted to add two numbers together, you'd need to Serialize the two numbers, then Deserialize them, then serialize a response, and deserialize the response. It's a bit complex.
This is where the RemoteExecuteOperation comes in. It has a "lifecycle" of three methods:
- BeforeRemoteExecuteAsync (optional, happens on the BuildMaster/Otter server)
- RemoteExecuteAsync (required, executes on the remote server)
- AfterRemoteExecuteAsync (optional, happens on the BuildMaster/Otter server)
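To make that concrete, here's a rough sketch of that add-two-numbers example as a RemoteExecuteOperation. This is written from memory rather than against the SDK docs, so treat the attributes and exact method signatures as assumptions:

```
using System.Threading.Tasks;
using Inedo.Diagnostics;
using Inedo.Extensibility;
using Inedo.Extensibility.Operations;

[ScriptAlias("Add-Numbers")]
public sealed class AddNumbersOperation : RemoteExecuteOperation
{
    [ScriptAlias("First")]
    public int First { get; set; }

    [ScriptAlias("Second")]
    public int Second { get; set; }

    // Runs on the remote server; the framework ships the operation's
    // properties over and the return value back, so there's no manual
    // Serialize/Deserialize step like with a raw RemoteJob.
    protected override Task<object> RemoteExecuteAsync(IRemoteOperationExecutionContext context)
        => Task.FromResult<object>(this.First + this.Second);

    // Runs back on the BuildMaster/Otter server with the deserialized result.
    protected override Task AfterRemoteExecuteAsync(object result)
    {
        this.LogInformation($"Sum = {result}");
        return Task.CompletedTask;
    }

    // (Other required overrides, like GetDescription, omitted for brevity.)
}
```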
Hope all this helps!
-
RE: An error occurred in the web application: Server cannot set content type after HTTP headers have been sent.
That message is basically the result of a bug in error-handling logic: an error is occurring while displaying the error. This can happen due to certain IIS or server settings, and in later versions you should see a more appropriate message.
It's hard to say what the problem is, but if you didn't change anything, I would just reboot, and the problem might go away. Changing App Pool settings can also help (like Classic -> Integrated or vice versa).
You can also try upgrading to see the underlying message; v4.6 is pretty old anyway.
-
RE: OTTER 3 - Variable at environnement level not found
@philippe-camelio_3885 thanks!
How many environments is that server in? Just one environment, or multiple? If multiple, can you try it with just one?
If it still doesn't work, then I'd like to try to reproduce it. Can you share your infrastructure JSON file (Admin > Export Infrastructure)? If it's sensitive info, then don't worry -- you can just send it to support at inedo dot com (instead of posting publicly), with [QA-568] in the subject. I can fish it out of the box then, and attach it to our internal tracker.
-
RE: 500 Internal Server Error when pushing docker image
Thanks @Stephen-Schaff! FYI, we are currently investigating this, and I hope to hear back soon. I'm starting to think that maybe this is the issue, but I'd like to confirm and fix it first...
-
RE: OTTER 3 - Variable at environnement level not found
Hmmm... I did a quick test of environment-scoped variable resolution, and it seems to work fine for me; but it can get a bit complex with nested environments and servers in multiple environments, so there might be a bug.
Is "INTEGRATION" a nested environment? How many environments is your server in?
Thanks!
-
RE: OTTER 3 - Help needed on PSEnsure (works on Plan but not on Configuration)
@philippe-camelio_3885 thanks for sharing all the logs!
Can you confirm... which version of the Scripting extension are you using? I see @apxltd made a pre-release version (1.10.2-rc.3) but it's not yet released.
-
RE: [Otter 3] Upgrade Inedo Agent failed
Just a guess, but v46 required you to explicitly set an Instance Name in the product, and I'm guessing you didn't do that. This can cause some strange behaviors, and maybe that's what happened here.
From here, I recommend just reinstalling v49 on the server manually. Alternatively, you could upgrade to v49 first (you can manually enter a URL in Otter for the installation package).
FYI: in v49, instancing is automatic, based on IP address.
-
RE: Setting runtime variable from powershell script
@ashah_4271 can you share your OtterScript? and show specifically what you'd like to do? Happy to help if we can!
-
RE: 500 Internal Server Error when pushing docker image
Hi @Stephen-Schaff,
An error like that should be better reported by Docker (that XML is their format), but it also should have appeared in the Diagnostic Center, since it's an unexpected server error. It's likely being suppressed by mistake, along with expected errors like "tag not found" and the like.
As for why the error occurred, it's hard to say -- but definitely a bug. Do both of your feeds have the same storage type? Under Manage Feed > Storage?
There is some complexity with "common blob storage", so it's possibly related to that. If you can share info, we can try to repro and then fix.
@viceice thanks for reporting that. It's unrelated. The error message I see (in Admin > Diagnostic Center) is implying that the problem is related to our database structure.... perhaps from installing a pre-release version? We use fairly unstable versions for our internal production environments :)
-
RE: ProGet-Server sporadically returns no packages
Hi @m-janssen_0802 ,
We've identified a potential fix for this (PG-1942), and have already pushed a code change as pre-release version 5.3.27-rc.15.
This is installable via the Inedo Hub when you configure the installation source as https://proget.inedo.com/upack/PrereleaseProducts/ - this can be changed with the little [config] button in the Inedo Hub, bottom-left corner.
Thanks,
Alana -
RE: ProGet Handling of PowerShell Gallery Versions
It's strange indeed. I wonder if it's related to a locally cached version you have?
I can't seem to find any documentation, but I believe PowerShell expects to receive only the latest available version unless a specific version is specified (this is based on the fact that if more than one package is found, it errors). Looking further into ProGet returning two results, we see that it is returning the latest non-local version as well as the latest local version. I think if ProGet were configured to only return the latest version, that would work better.
Behind the scenes, PowerShell is calling the FindPackagesById() NuGet API method, which is supposed to return all versions.
You should see almost identical results from:
- https://www.powershellgallery.com/api/v2/FindPackagesById()?id='ExchangeOnlineManagement'
- https://proget.company.com/nuget/PowerShell/FindPackagesById()?id='ExchangeOnlineManagement'
You'll notice the same with InvokeBuild as well; all versions are returned from the API, as expected. So this is why I think there are multiple repositories (a local one?), or caching, or some other thing going on with the client. I know you can install a module locally, in PowerShell, by copying the module file to a directory?
Sorry, I'm not really good at debugging the PowerShellGet client, but hopefully we can figure it out...
Cheers,
Alana -
RE: ProGet Handling of PowerShell Gallery Versions
Hi @Michael-poutre_3915 !
I'm not a PowerShell expert by any means, but there's nothing "quirky" about the ExchangeOnlineManagement package or its data, so I don't think that's the problem.
What you're doing should work fine, however. It seems like very basic usage, and it's just odd to see WARNINGS about the module being "matched"...
I searched the error "Unable to install, multiple modules matched" and found lots of results:
- https://blogs.msmvps.com/richardsiddaway/category/powershellget/
- https://stackoverflow.com/questions/63612214/unable-to-install-az-modules
Lots of suggestions, but this error seems to happen if there are multiple repositories configured... and apparently it's possible to have multiple repositories named ProGet?
So I'm wondering if this is ultimately some sort of client configuration quirk/bug?
-
RE: Feature Request - ProGet - Update vulnerability list if a package is not available in any feed
Hi @harald-somnes-hanssen_2204,
Unfortunately this gets pretty complex, pretty quickly, due to the way these are stored (in ranges, per package type).
In order to clean up vulnerabilities, we would need to scan all feeds of that type, then their packages, and then do version comparisons. So it would need to be a separate Vulnerability Cleanup job (as you originally suggested), and not a retention rule.
It's unfortunately not an easy engineering task on our end, especially since it could be quite resource-intensive, depending on the number of vulnerabilities/packages.
All the mass-clicking seems really annoying for sure, but we need to weigh the engineering cost/effort of this feature against the benefits and alternatives. For now, we should wait to see if anyone else expresses interest or issues with managing vulnerabilities.
I also want to note that we have a project on the horizon for a kind of multi-selecting UI table (where you could do bulk operations on selected items), and perhaps that would help here instead.
-
RE: Promotion API not working for Docker Feeds
Hi @Stephen-Schaff ,
What you're doing should work, but this appears to be a bug due to the special handling required for Docker image promotion via the API - so I've logged it as PG-1939 - we should get it fixed in the next maintenance release.
-
RE: [Otter 3] Upgrade Inedo Agent failed
@philippe-camelio_3885 a failure there is rather peculiar, and the error message just looks like a generic "can't connect" error. The agent upgrade hasn't really even started yet; it's just the initial prep of downloading the file.
That message occurs when making a very basic "direct connection" to the Inedo Agent (as opposed to the more complicated "pass-through" connection to the Otter Agent that the Inedo Agent manages).
The other time a "direct connection" is made is during the "Server Checker" task runner. That runs on service startup, then every hour, or when you trigger it manually (Admin > Service) -- that's one way you could check it.
So, it could just be bad timing? Maybe the server was actually inactive, or the agent was down?
-
RE: OTTER 3 - $CredentialProperty not working after migration from Otter 2.X
Ah ok, I was able to reproduce this behavior; it was unfortunately a 3.0 regression introduced as part of the legacy ResourceCredentials migration... but it's relatively easy to fix as OT-413 -- it's already scheduled for Otter 3.0.5 (next Friday), but we could make it available as a patch/pre-release version if you'd prefer?
-
RE: RPM push error - A 400 error occurred in rpm-dev: Unable to parse package header. The supplied package may be an invalid RPM file.
You're right, there haven't been any RPM changes... the error message implies that ProGet is rejecting the data you're sending for some reason: "Unable to parse package header. The supplied package may be an invalid RPM file."
It might be a .NET 5 issue, just based on the version numbers you told me. Since ProGet v5.3.12, the inedo/proget image is hosted using the .NET Core runtime; previously, it was hosted using the Mono runtime. The inedo/progetmono image is available up to v5.3.19.
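If you want to rule out the runtime change, one option might be to temporarily run the Mono-based image; the exact tag here is an assumption, so check Docker Hub for what's published:

```
# Hypothetical tag -- pin the Mono-based image to rule out a .NET 5 runtime issue
docker pull inedo/progetmono:5.3.19
```
Cheers,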
Alana -
RE: Docker URL for a specific feed
Hi @Stephen-Schaff,
Docker is pretty wonky, and technically doesn't support multiple registries per host.
A Docker Registry is tied to a host (like myproget.mydomain.net), not a URL. A Docker Registry's catalog API endpoint is always «host-name»/v2/_catalog, but I guess your third-party tool isn't so particular, and does a basic concatenation against the field.
This silly implementation makes no sense for ProGet -- so we just work around it by requiring a "Namespace" on your Docker Registries that starts with the feed name.
So technically, you have a Docker Registry at «host-name» that hosts a bunch of Docker Repositories named «feed-name»/«repo-name». I'm afraid your only option here is to do what Docker wants you to -- which is create a user with restricted permissions to the containers you want.
RE: OTTER 3 - $CredentialProperty not working after migration from Otter 2.X
@philippe-camelio_3885 sorry this has been frustrating!
There is a compatibility shim for this that should have picked it up... but after the migration from ResourceCredentials to SecureResources and SecureCredentials, we no longer use the type name to qualify credentials.
Long story short, this should fix it:

```
set $CredentialUser = $CredentialProperty(myaccount, Username);
set $CredentialPwd = $CredentialProperty(myaccount, Password);
```
-
RE: Return 200 intead 404 when package does not exist
@afd-compras_2365 ProGet implements the NuGet Server API, so you can use ProGet as a private NuGet server; we did not design that API, we simply implement it.
Here is the same API call on the NuGet Gallery with a package that doesn't exist:
https://www.nuget.org/api/v2/FindPackagesById()?id='NotRealPackageFakeFAke'&semVerLevel=2.0.0
I'm not sure I understand the dependency question with nuget.exe, though... but in a case like this, I believe you can just put those packages in your feeds? Then you won't get update issues.
-
RE: Azure Blob error when upload PyPi package
@brett-polivka sorry about that, there's always room to improve the testing, and you just got "unlucky" with a few edge cases that slipped through the cracks.
But not to worry -- you can just downgrade the extension by following the manual installation instructions; version 1.9.0 of the Azure extension still uses the old libraries - https://proget.inedo.com/feeds/Extensions/inedox/Azure/1.9.0
-
RE: Return 200 intead 404 when package does not exist
@afd-compras_2365 that's the proper behavior of the /FindPackagesById() API endpoint; as for why that's the proper behavior, you'd probably have to track down the folks at Microsoft who designed the API over ten years ago and ask them ;)
That particular API returns a result set; if the set has 0 items, then no packages with that ID were found.