    Inedo Community Forums
    stevedennis
    @stevedennis
    inedo-engineer administrators

    Reputation: 27 · Posts: 472 · Profile views: 36 · Followers: 0 · Following: 0

    Best posts made by stevedennis

    • RE: Proget: delete all versions of a package via API

      Hi @mcascone ,

      We don't have a single API method that deletes all versions of a package, but a foreach loop over the versions will do the trick!
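
      As a rough sketch (PowerShell), here's what that loop could look like against a Universal (upack) feed; the server, feed, group, and package names are placeholders, and the endpoint paths follow the upack feed API as I recall it, so double-check them against your ProGet version's docs:

      $base = "https://proget.corp.local/upack/MyFeed"
      $headers = @{ "X-ApiKey" = "secret123" }   # key needs permission to delete packages

      # the versions endpoint lists every version of the package...
      $versions = Invoke-RestMethod "$base/versions?group=my-group&name=my-package" -Headers $headers

      # ...and the delete endpoint removes them one at a time
      foreach ($v in $versions) {
          Invoke-RestMethod "$base/delete/my-group/my-package/$($v.version)" -Method Delete -Headers $headers
      }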

      I should add that I am doing this as the first stab at an attempt to automatically delete packages from a development feed, when the corresponding branch in github is deleted

      I don't know the specifics/details of your use-case, but based on what I read, I'd recommend these guidelines:

      • assuming: one GitHub repository, one project, one package you want to release
      • use the same package name/group for all packages you create for this project, regardless of branch or development status
      • create your "dev" packages with a prerelease version number, something like -ci.## (assuming you use CI to build packages; see the sketch after this list)
      • embed the commit id and branch in your upack metadata file, for traceability
      • if you want to see which branch the package was created from using the version number alone, add a +branch metadata label to the version number (don't do this for master)
      • use repackaging and promotion to take your -ci packages to -rc and then to stable (and the desired feed)
      • let retention policies automatically clean up the -ci packages
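
      To make that concrete, here's a rough illustration of how the version numbers might progress; the package name, branch, and numbers are made up for the example:

          ci feed:       hdars-api 2.1.0-ci.47+feature-login   (feature branch build)
          ci feed:       hdars-api 2.1.0-ci.52                 (master build)
          rc feed:       hdars-api 2.1.0-rc.1                  (repackaged from 2.1.0-ci.52)
          release feed:  hdars-api 2.1.0                       (repackaged from 2.1.0-rc.1)
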
      posted in Support
    • RE: No option for NuGet package path under Advanced Settings

      Hi @kichikawa_2913 ,

      I think it's this way for "historic reasons" - most of the other feed types came later, and it seems no one ever changed these paths or noticed.

      Easy enough to make it configurable, but can you share your use case? Why do you want to use something other than a single root path with all of your packages?

      Anyway, I added a feature for this (PG-2006), and we should be able to get it into the next maintenance release.

      Cheers,

      Steve

      posted in Support
    • RE: Marking packages as deprecated

      Hi @benjamin-soddy_9591,

      No problem "resurrecting" topics! We definitely want to hear from users about feedback/feature requests.

      We still haven't had anyone else ask for deprecation since this request, but I wonder if there's a better solution to solving your challenges than this feature. It sounds like you want to increase governance of your NuGet Packages, potentially with some sort of compliance in mind.

      The dotnet list package --vulnerable command is probably not what you want for your organization; NuGet's Built-in Vulnerability Scanning is really limited, in part because it only reports on a fraction of known package vulnerabilities (164 as of today). It also won't block packages that you deem problematic, unlike ProGet's feature.

      The same is true with dotnet list package --outdated -- it's probably not what you want, because it relies on developers knowing (1) to run the command, and (2) what to do if there's an outdated dependency.

      There are better ways to manage third-party packages (see How to Create a Package Approval Workflow for NuGet), and you'd be better served knowing who's consuming outdated packages (see Use Package Consumers to Track Dependencies).

      Just some thoughts; like I said, we haven't had any demand for this feature, but these are proven solutions for improving governance of packages as organizations grow/expand their NuGet usage like you are.

      Cheers,
      Steve

      posted in Support
    • RE: Permissions only work when set for specific user, not a group (LDAP)

      Hi @kichikawa_2913 ,

      The NuGet client's behavior is based on NuGet.org, where authentication is never required to view/download packages. As such, it doesn't pass the API key when doing those queries; instead, you can use a username of api and your API key as the password.
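
      For example, registering the feed with those credentials could look something like this (the server and feed names are placeholders for your own):

      dotnet nuget add source https://proget.corp.local/nuget/MyFeed/v3/index.json --name ProGet --username api --password <your-api-key> --store-password-in-clear-text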

      Based on the issue though, it sounds like ProGet is unable to resolve the groups; I would use the "test privileges" function on the Tasks page to verify this. That will show you whether the username can download packages or not.

      The most common reason that groups aren't resolving is that the member is not directly in the group (i.e. they're in a group which is a member of the group), and you don't have recursive groups enabled; do note that this is really slow on some domains.

      Cheers,
      Steve

      posted in Support
    • RE: upack repack doesn't use complete version string from CLI

      Hi @mcascone ,

      Just looking at the code real quick, I suspect we have a bug where it writes out the wrong file name for the new package:
      https://github.com/Inedo/upack/blob/master/src/upack/Repack.cs#L120

      That's probably an easy fix, which we can do as part of this Q&A item. I'll wait to hear back about this one.

      As for the error, "The underlying connection was closed: An unexpected error occurred on a send.", that sounds like it's HTTPS related. Could you attach Fiddler, or something like that, to find out what's happening under the hood? We may be able to improve the error message to better report it if so.

      Cheers,
      Steve

      posted in Support
    • RE: Mixing ProGet Instances

      Hi @cimen-eray1_6870 ,

      Great questions; there's no problem having a second instance with ProGet Free Edition.

      The relevant restriction is that you can't use a Connector in ProGet Free Edition to connect to another instance of ProGet (either another Free Edition or your paid edition).

      Hopefully you can use your Maven feed as a proof of concept for implementing it in the main instance. Good luck!

      Cheers,
      Steve

      posted in Support
    • RE: Support for Rust Cargo packages

      Hi @brett-polivka,

      I've added it to our Other Feed Types page, and linked this as the official discussion thread.

      There are a lot of things to consider when developing a new feed type, but ultimately it comes down to two things: (1) how much value this feature brings to our users, and (2) how many new licenses of ProGet it would sell.

      The second question is where internal market research comes in, but we would love your opinion on the first question.

      Here's a nice and simple way to help understand value: how much more do you suppose your company/organization would pay for this feature if it were available as a hypothetical add-on? $100/year? $1,000/year? $10,000/year? And why? What time is it saving, what risk is it mitigating, etc.?

      The second part of the value equation is how much effort it will take, technically speaking. It's more than 15 minutes obviously, but is it 10 hours? 100 hours?

      On the plus side, the package format seems to be documented pretty well. However, the registry API has a huge red flag:

      The index key should be a URL to a git repository with the registry's index.

      Does this mean their API is Git-based, and we'd need to first add private Git repository hosting to ProGet? And did they test it with private/authenticated Git repositories, or just their public (probably GitHub) repository? 🙄

      posted in Support
    • RE: Debian feed mirror Performance

      @stefan-hakansson_8938 as you noticed, ProGet's Debian connectors are not currently designed to handle gigantic operating-system mirrors very well. This is because connector content is always refreshed "on demand", which is what you want for CI/CD workflows.

      It's not great for public repository mirroring, however. In Q4 or later, we will explore adding an option to do periodic updates.

      posted in Support
    • RE: Proget - Can't use the native API even with an API Key with Native API access

      Hi @m-webster_0049 ,

      The first thing I would try to troubleshoot this is switching to a very basic API key like hello. That just eliminates any typos, spacing issues, etc.

      Next, I would try specifying the API key via the X-ApiKey header (see docs) - just to see if you get a different error. It's possible there is a regression somewhere.
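
      For example, a quick test from the command line could look like this (the hostname is a placeholder, and Feeds_GetFeeds is just one Native API method to try; check the Native API docs for the exact method you need):

      curl -H "X-ApiKey: hello" "https://proget.corp.local/api/json/Feeds_GetFeeds"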

      Best,
      Steve

      posted in Support
    • RE: How to set content type of asset with API?

      @joshuagilman_1054 this is currently planned for 5.3.27 as PG-1934 (April 17) - we'll let you know if plans change!

      posted in Support

    Latest posts made by stevedennis

    • RE: Support for NotAutomatic/ButAutomaticUpgrades headers in Debian feed Release files

      Hi @geraldizo_0690,

      Thanks for the pointers -- just as an FYI, these would have to be feed-level settings, but the drop-downs would be the same.

      For reference, here's the code we use to generate the Release file --- I'm not sure what those other header values do, but we'd probably just want to add the two you suggested (a rough sketch of that follows the code below).

      What do you think?

      I suspect this will be a quick, opt-in change!

      private void WriteReleaseFile(Stream output)
      {
          using var writer = new StreamWriter(output, InedoLib.UTF8Encoding, leaveOpen: true) { NewLine = "\n" };
          writer.WriteLine($"Suite: {this.Distro}");
          writer.WriteLine($"Codename: {this.Distro}");
          writer.WriteLine(FormattableString.Invariant($"Date: {this.Generated:ddd', 'dd' 'MMM' 'yyyy' 'HH':'mm':'ss' UTC'}"));
          // NotAutomatic: yes  <-- add here
          // ButAutomaticUpgrades: yes  <-- add here
          writer.WriteLine($"Architectures: {string.Join(' ', this.indexes.Select(i => i.Architecture).Distinct(StringComparer.OrdinalIgnoreCase))}");
          writer.WriteLine($"Components: {string.Join(' ', this.indexes.Select(i => i.Component).Distinct(StringComparer.OrdinalIgnoreCase))}");
      
          var desc = FeedCache.GetFeed(this.feedId)?.Feed_Description;
          if (!string.IsNullOrWhiteSpace(desc))
              writer.WriteLine($"Description: {desc.ReplaceLineEndings(" ")}");
      
          writeHashes("MD5Sum:", i => i.MD5);
          writeHashes("SHA1:", i => i.SHA1);
          writeHashes("SHA256:", i => i.SHA256);
          writeHashes("SHA512:", i => i.SHA512);
      
          void writeHashes(string name, Func<IndexHashData, byte[]> getHash)
          {
              writer.WriteLine(name);
              foreach (var i in this.indexes)
              {
                  writer.WriteLine($" {Convert.ToHexString(getHash(i.Uncompressed)).ToLowerInvariant()} {i.Uncompressed.Length,16} {i.Component}/binary-{i.Architecture}/Packages");
                  writer.WriteLine($" {Convert.ToHexString(getHash(i.GZip)).ToLowerInvariant()} {i.GZip.Length,16} {i.Component}/binary-{i.Architecture}/Packages.gz");
              }
          }
      }
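
      And here's a rough sketch of what that opt-in addition might look like in the method above; the NotAutomatic/ButAutomaticUpgrades property names are hypothetical placeholders for whatever the feed-level settings end up being called:

          // hypothetical feed-level settings; these lines would go right after the Date header
          if (this.NotAutomatic)
              writer.WriteLine("NotAutomatic: yes");
          if (this.ButAutomaticUpgrades)
              writer.WriteLine("ButAutomaticUpgrades: yes");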
      
      posted in Support
    • RE: Bitbucket authentication issues

      Hi @brandon_owensby_2976 ,

      It sounds like you're on the right track with troubleshooting; the issue is definitely on the server side in this case, so I asked ChatGPT. Who knows if any of this is accurate, but...

      This is a very common situation with older versions of Bitbucket Server (especially pre-6.x / pre-7.x era, but even up to some 7.x versions in certain setups).

      The REST API (e.g. /rest/api/1.0/...) and the Git Smart HTTP protocol (/scm/.../info/refs, /git-upload-pack, etc.) are handled by different authentication filters in Bitbucket Server.

      Most likely you're using a Personal Access Token / HTTP Access Token (most frequent cause in older versions). In many Bitbucket Server versions (especially ≤ 7.17–7.21), HTTP access tokens were designed mainly for REST API and did not work reliably (or at all) for Git over HTTPS in many cases.

      As a workaround, you need to use a real username + password (or username + app password if 2FA is on) for Git operations.

      We've seen similar behavior in really old versions of ADO, GitHub, etc., where API tokens wouldn't work for Git.

      Anyway, I would try that - at least from the curl side of things. And maybe upgrading will help as well. If it works, then you'll likely only be able to use a Generic Git repository with a real username/password -- and just create a special builds user that effectively acts like an API key.

      Cheers,
      Steve

      posted in Support
    • RE: Suggestion: Show total container image size

      Hi @Stephen-Schaff,

      It seems pretty easy to add these up and display them on the screen! I suppose the "hard part" is the UI...

      A "Total" line doesn't seem to look right. And it seems like too little information to put in one of those info-boxes. "Total Size: XXXX MB" at the bottom just looks incomplete.

      Any suggestions? I'm struggling a bit to see how it could be displayed without looking a little out of place... and since it was your idea I figured I'd ask ;)

      Thanks,

      Steve

      posted in Support
    • RE: Support for NotAutomatic/ButAutomaticUpgrades headers in Debian feed Release files

      Hi @geraldizo_0690,

      I'm not all that familiar with Debian/APT, but I briefly researched this, and it seems like it involves adding values like these at the top of the Release file:

      NotAutomatic: yes 
      ButAutomaticUpgrades: yes
      

      Is that really it? And this setting would impact the entire feed... but have no real relation to connectors or packages?

      If that's the case, how would you envision configuring this? I'm thinking on the Feed Properties page, but perhaps as a checkbox? How do other products/tools do it in your experience?

      Thanks,
      Steve

      posted in Support
    • RE: Bitbucket authentication issues

      Hi @brandon_owensby_2976 ,

      Thanks for the feedback. Based on what you described, it sounds like...

      • BitBucket Server API is working fine
      • Git API is not working due to a failed/failing authentication challenge

      You were able to confirm this with the "Generic Git Repository" also not working. If you were to do a curl -I -u USERNAME:APIKEY https://.../.git you would most certainly get a 401 response as well.

      Anyway that's where I would start -- try to figure out why the Git API is not accepting the credentials. It's most likely related to permissions on the key, but it's really hard to say... just a guess.

      Thanks,
      Steve

      posted in Support
    • RE: Migrate Feed from C drive to another drive

      Hi @Julian-huebner_9077 ,

      Here's some information on file storage paths:
      https://docs.inedo.com/docs/proget/feeds/feed-overview/proget-feed-storage

      Long story short, if you modify Storage.PackagesRootPath under Admin > Advanced Settings and move your files as needed, then it should work just fine.
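
      As a rough sketch of the process (the paths below assume the default install location; adjust for your setup):

      # 1. stop the ProGet service/web site so nothing writes to the package store during the move
      # 2. copy the existing package store to the new drive
      robocopy "C:\ProgramData\ProGet\Packages" "D:\ProGet\Packages" /E
      # 3. set Storage.PackagesRootPath to D:\ProGet\Packages under Admin > Advanced Settings
      # 4. restart ProGet and spot-check a few package downloads before deleting the old folder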

      Thanks,
      Steve

      posted in Support
    • RE: BuildMaster fails to return TeamCity build configs

      Hi @kquinn_2909 ,

      We haven't forgotten about this; the issue is trying to figure out steps to reproduce it based on the information we have... considering it works on our test instance and all. We may consider putting some more debugging code in, though figuring out how to expose that in this context is a little challenging.

      Just as a sanity check though, do you have a project that doesn't have a "space" in the name? I want to make sure this isn't something as simple as WebProjects%20Replicator vs WebProjects_Replicator.

      The other idea is authentication/authorization, though I would imagine you would get an error accessing the project instead of no builds.

      Thanks,
      Steve

      posted in Support
    • RE: [ProGet] Recently Published Rule

      Hi @jstrassburg_8563,

      if the resolved version that npm i underscore chose was released in the blocking period, the npm command would 400?

      If you have "Block Noncompliant Packages" enabled (which we generally don't recommend) and a rule that flags new packages as noncompliant, then the npm command would most certainly give some kind of error.

      You will probably see a 400 code, but I don't think npm will display the message that's sent by ProGet (i.e. "package blocked due to..."). The real issue comes with a large dependency tree, where it'll be hard to know exactly what the problem is.

      As such, we recommend running pgutil builds scan/audit in your CI/CD pipelines instead of blocking. This will produce a much easier-to-understand report, and even allows you to bypass reported issues on a case-by-case basis.
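
      As a very rough sketch of what that CI step could look like (the project name and version are placeholders, and the option names are assumptions from memory -- check pgutil's built-in help for the exact syntax):

      # submit the build's dependencies to ProGet and evaluate them against your rules
      pgutil builds scan --project=MyApp --version=1.2.3
      # report the compliance issues found for that build
      pgutil builds audit --project=MyApp --version=1.2.3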

      Thanks,
      Steve

      posted in Support
    • RE: Helm connector with relative url

      Hi @sigurd-hansen_7559 ,

      Thanks for pointing me to the repository; I was able to reproduce this, and it will be fixed via PG-3194 in the next maintenance release (scheduled for Friday).

      Thanks,
      Steve

      posted in Support
    • RE: Not able to download Docker images that doesn't have / in it

      Hi @toseb82171_2602,

      In Docker, images must have a namespace. When they don't, the Docker client will transparently prepend library/ to the image name. In general, relying on this behavior is not desirable, and it's recommended to use library/python explicitly or, in a ProGet context, myproget.corp/mydockerfeed/library/python.
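
      For example (the server and feed names below are just placeholders):

      # on Docker Hub, "python" is really shorthand for "library/python"
      docker pull library/python:3.12
      # when pulling through a ProGet feed, spell the namespace out the same way
      docker pull myproget.corp/mydockerfeed/library/python:3.12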

      That said, there is a setting on the connector that may help resolve such images, but these images can be problematic even with that.

      As for extending the trial, no problem - you can actually do this yourself on my.inedo.com on the day of expiry. Of course, please contact us if you run into any issues or have licensing questions.

      Thanks,
      Steve

      posted in Support