    Inedo Community Forums

    Posts made by stevedennis

    • RE: Unable to GET from connector "nuget.org"; using cached copy.

      Hi @parthu-reddy,

      If you (the NuGet client) request the NewtonSoft.Json-13.0.0 package, and ProGet already has that package cached, then ProGet will return the package without ever contacting NuGet.org.

      However, the NuGet client will also request a list of "all versions of NewtonSoft.Json" when doing a package restore. I don't know why; that's just what it does.

      In this case, ProGet will contact the connector (NuGet.org) and aggregate the remote/local results. If the connector is unreachable (as in the case above), ProGet will log a warning and instead return a list of "all (cached) versions of NewtonSoft.Json"... which will probably be all that's needed.

      This is likely why no one has complained/noticed and jobs aren't failing.
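
      For reference, that "all versions" lookup during restore is just a single index request against the feed's NuGet v3 API. Here's a minimal sketch of what that request looks like, using nuget.org's flat-container endpoint (against a ProGet feed, the base URL would be discovered from the feed's v3 service index instead):

      using System;
      using System.Net.Http;
      using System.Text.Json;
      using System.Threading.Tasks;

      class VersionListDemo
      {
          static async Task Main()
          {
              // During restore, the NuGet client asks for every known version of a package ID.
              // The package ID is lowercased for the flat-container endpoint.
              var url = "https://api.nuget.org/v3-flatcontainer/newtonsoft.json/index.json";

              using var http = new HttpClient();
              var json = await http.GetStringAsync(url);

              // Response shape: { "versions": [ "3.5.8", ..., "13.0.3" ] }
              using var doc = JsonDocument.Parse(json);
              foreach (var version in doc.RootElement.GetProperty("versions").EnumerateArray())
                  Console.WriteLine(version.GetString());
          }
      }

      When the connector is unreachable, ProGet answers this same kind of request from whatever versions it has cached.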

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: ProGet 2024: SCA & Build Restrictions

      Hi @caterina ,

      In ProGet 2024+, the Project Analyzer will give some kind of warning/error, and builds 1001+ will be in an "inconclusive" state instead of "Compliant", "Warn", or "Noncompliant".

      That said, ProGet 2024+ does include capabilities to auto-archive builds through use of status/pipelines, so that might be worth investigating.

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Default Chocolatey feeds to v2 and v3 enabled

      hey @steviecoaster ,

      Great idea! This will be the default starting in the next maintenance release via PG-2965

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Unable to GET from connector "nuget.org"; using cached copy.

      Hi @parthu-reddy,

      This basically means that ProGet was unable to communicate with nuget.org for one reason or another. More specifically... ProGet didn't receive a timely response.

      Usually it's a temporary outage or network issue, and will hopefully go away and not cause any issues.

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Import Artifacts from Artifactory 404 Error

      Hi @misael-esperanzate_5668,

      Currently, ProGet uses the nexus-maven-repository-index.gz file to see what artifacts are in a Maven repository, and then downloads the files based on that. So you'll need to enable that index file by doing this, I think:
      https://jfrog.com/help/r/jfrog-artifactory-documentation/maven-indexer

      This only applies to Maven indexes, and that file is only used for importing/downloading a feed into ProGet. For other feed types, a different API is used.

      Good news --- in an upcoming maintenance release of ProGet, we will be shifting to use the Artifactory API directly to import artifacts from a repository.

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Block package from download

      Hi Paddy,

      Connector Filters allow/block packages by name, not by version. The package name is FluentAssertions, not FluentAssertions:>=8.0.0, so that's why it's not behaving as you expect.

      Please see Dealing with Fluent Assertions License Changes in ProGet to learn more how to address this particular issue.

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Block package from download

      Hi @itadmin_9894 ,

      Can you give us a few more details of what you're trying to do? A connector filter is intended to allow or block a package by name; it does not filter out versions.

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: 0 byte download when using pgutil assets/packages download in github workflow

      Hi @mmaharjan_0067 ,

      I'm afraid we're at a loss here; as you noticed, it works fine on the command line - but there's just something "off" about your runner.

      The only thing we can figure is some kind of network block or interference. We don't really have a way to test/debug this any further.

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: OCI Support for Assets

      Thanks for the additional feedback @dan-brown_0128!

      We do have a thread about OCI Registries in ProGet, as they have been requested from time to time. It might be worth posting this there, too?

      But our current take is that "OCI Registries are a poorly-designed solution in search of a misunderstood problem", and that they are technologically inferior to alternatives. Here's a quote from that post:

      The main issue I have is that an OCI registry is tied to a hostname, not to a URL. This is not what users expect or want with ProGet -- we have feeds. Users want to proxy public content, promote content across feeds, etc. None of this is possible in an OCI registry.

      We got Docker working as a feed by "hijacking" the repository namespace to contain a feed name. Helm charts don't have namespaces, so this is a no go.

      Personally I can't imagine how this is scalable. A lot of new Docker users (including me) are "shocked" that you have to include the "source" in the container name (e.g. proget.corp/my-docker-feed/my-group/my-app) -- I can't see how this could ever work to expand this to all deployment artifacts (e.g. my-artifacts.corp/my-group/my-app/service-assembly), especially considering how often everything is referenced in scripts and dependencies.

      Anyway, best to continue on that other thread if you'd like to keep the discussion going; we're always open to learning more :)

      posted in Support
      stevedennis
      stevedennis
    • RE: Mirrored Replication

      Hi @james-woods_8996 ,

      You are correct - in a mirror scenario, either side could publish (or delete) packages, and it would be propagated to the other. Three-way is also possible, but you do that with two two-way relationships, if that makes sense?

      So it's basically A <--> B + B <--> C, and that means if you publish/delete a version on A it will propagate like A --> B --> C.

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: OCI Support for Assets

      Hi @dan-brown_0128,

      I'm afraid that's not possible -- Asset Directories are essentially a web-based file system, whereas OCI Registries are basically Docker Registries. Based on the context of that screenshot, I'd be surprised if the OCI registry could be used for something other than a container image registry.

      OCI/Docker Registries can technically store any kind of large binary object, and they do so using a 256-bit digest hash (think a double-guid). These "blobs" have no names or extensions - some are "tagged" to make them human readable/accessible. These tags are not only mutable, but can be anything you'd like (v2024.1 or purple).
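
      To make the tag-vs-digest point concrete, here's a rough sketch of fetching the same manifest both ways using the standard OCI distribution API routes (the registry, repository, and digest values are placeholders):

      using System;
      using System.Net.Http;
      using System.Threading.Tasks;

      class OciManifestDemo
      {
          static async Task Main()
          {
              // Placeholder registry/repository -- any OCI-compliant registry exposes these routes.
              var registry = "https://registry.example.com";
              var repository = "my-group/my-app";

              using var http = new HttpClient();
              http.DefaultRequestHeaders.Accept.ParseAdd("application/vnd.oci.image.manifest.v1+json");

              // By tag: a mutable, human-friendly pointer that could reference different content tomorrow.
              var byTag = await http.GetAsync($"{registry}/v2/{repository}/manifests/v2024.1");

              // By digest: immutable content addressing -- always the same bytes, never human-readable.
              var digest = "sha256:0000000000000000000000000000000000000000000000000000000000000000";
              var byDigest = await http.GetAsync($"{registry}/v2/{repository}/manifests/{digest}");

              Console.WriteLine($"by tag: {(int)byTag.StatusCode}, by digest: {(int)byDigest.StatusCode}");
          }
      }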

      That said, OCI Registries are not suitable storage for artifacts by any means. Packages are ideal (self-contained metadata) - but at least plain files have names/extensions that give you a clue as to their provenance.

      On that note, might I suggest a different approach to deployment to explore down the line, as you're looking to improve/optimize things:
      https://inedo.com/buildmaster/vs-octopus-deploy

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Get Package Policies using ProGetClient

      Hi @pmsensi ,

      Short of using the Native API, I'm afraid we don't have a first-class API to export/import package policies and their related information

      We may consider adding that after ProGet 2025 is released.

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Get package license with ProGetClient

      Hi @pmsensi ,

      The pgutil builds audit command should show the same license information you see in the UI:
      https://docs.inedo.com/docs/proget/api/sca/builds/analyze

      The pgutil packages audit command should also provide similar information on a package level.

      Is that what you're looking for?

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Upgrade to 2024.29 crashes

      Hi @v-makkenze_6348 ,

      This error was caused by an issue on our end - basically there was some bad data in the package that the InedoHub downloaded, and it caused the installer script to crash. We fixed the package on our end, so reinstalling normally should work now.

      Cheers,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Otter server receives thousands of connections from agent after reboot

      Hi @jimbobmcgee ,

      I'm afraid we've decided to not implement OT-516 at this time; for this particular issue, we weren't comfortable with a "blind change" (i.e. coding without testing) because we kind of forgot how it all works (it's been years since it was implemented) and we don't have a readily-available testing environment for the listener/incoming agents. So it's a higher level of effort than we can put in for a community/free user.

      Anyway this will shift to our roadmap for Otter 2025, where we will dedicate time to make other improvements as well.

      Cheers,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Creating a new PowerShell module

      Hi @steviecoaster ,

      That's great... sure, go ahead and publish; no objections on our end at all :)

      Actually, there is at least one PowerShell Module out there already that we occasionally point people to: https://www.powershellgallery.com/packages/ProGetAutomation

      That one's been around for a long while... certainly since before we had a lot of the API endpoints that you're using. I also don't think it does user provisioning, etc. So, it'd be nice to have alternatives to share.

      We're also happy to take a look, but we most definitely aren't PowerShell Module experts. So we can't give feedback on the code/architecture/structure. The Native API / StoredProcs are usually pretty stable, but with the ongoing Postgres development, we are doing some minor refactoring of the SQL Stored Procs to ensure parameter consistency.

      Cheers,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Proget: SCA Event Notifier not working

      Hi @caterina ,

      It looks like there was an error processing the template (i.e. what's on the customize webhook tab). Can you share that, and maybe we can spot it?

      It's likely a syntax error of some kind.

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: NPM Package name case sensitivity

      Hi @pbspec2_5732 ,

      The script in the linked gist should fix the problem for you; it's not feasible/possible to try editing in the database directly due to the complexity of the model.

      https://gist.github.com/apxltd/351d328023c1c32852c30c335952fabb

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Vulnerability JQuery Proget 5.2.14

      Hi @r-vanmeurs_4680 ,

      This is a false positive and you can disregard it; ProGet 5.2 is not impacted by the vulnerability in JQuery for many reasons, including the fact that the vulnerable code is not used and, even if it were, ProGet is protected on the server side from such "attacks".

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Configure connectors for Debian2

      Hi @arkady-karasin_6391,

      What are you specifying for Distribution in the connector? When I tried these settings, it worked:
      (screenshot of the working connector settings)

      You need to specify one of the available distributions, which are listed here:
      http://archive.ubuntu.com/ubuntu/dists/

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: pgutil usage

      Hi @caterina,

      We haven't thought of adding a --stage option to the pgutil builds scan command, but of course it's possible. I know that we are exploring other options like associating builds/projects with pipelines (i.e. a sequence of stages).

      So perhaps that would mean, pgutil builds scan --workflow=MyWorkflow, and then MyWorkflow would start/end in different stages.

      For now, we'd love to see how you utilize the stages. Maybe it means calling two commands temporarily, and we can revisit once we add new features, etc.?

      For the pgutil builds create command, the project must already exist, or you'll get that error. You can create or update a project using pgutil builds projects create.

      So basically, these commands would probably do what you want?

      pgutil builds projects create --project=BuildStageTest
      pgutil builds create --build=1.0.0 --project=BuildStageTest --stage=DesiredStage
      

      Note that when using pgutil builds create, you can also specify a stage name, like in the example above.

      posted in Support
      stevedennis
      stevedennis
    • RE: Questions regarding ProGet Usage

      Hi @arunkrishnasaamy-balasundaram_4161 ,

      Thanks for clarifying!

      [1] The MavenIndex file is not at all required to download artifacts from a remote Maven Repository, nor to see the latest version of artifacts. In ProGet, all it allows you to do is browse remote artifact files in the ProGet UI, which typically isn't very helpful.

      [2] It's not possible to change this

      [3] ProGet does not have "group repositories", but uses feeds with connectors. The model is different, and feeds with connectors will often cache packages in a lot of organizations.

      [4] It's likely you will be unsuccessful with a setup like this in your ProGet configuration, or at least give your users a big headache and lots of pain/confusion. This is considered an "old mindset" for configuring artifact repositories, one based on "files and folders on a share drive" rather than packages.

      This "closed approach" will greatly slows down development, causes duplicate code, and lots of other problems. Modern development approaches do not use this level of highly-granular permission. Instead, they take a innersource model. You do not need to make everything available to everyone.

      However, less than 1% of your 2k projects will contain sensitive IP or data that other teams can't access - those projects should be segregated into sensitive feeds. The logic is, "if everything is sensitive, then nothing is sensitive"

      [5] ProGet does not generate a "support zip file"; if we require additional information when supporting users, we ask for the specific information

      posted in Support
      stevedennis
      stevedennis
    • RE: ProGet Asset: downloaded installer is no longer executable

      Hi @uvonceumern_6611 ,

      Thanks for providing all of the additional information; based on what you shared, it looks like the file is actually being uploaded incorrectly... using multipart/form-data encoding instead of a basic PUT or POST of the body contents.

      Please see the Upload Asset File documentation for more information.
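
      For what it's worth, here's a minimal sketch of a plain PUT upload with HttpClient; the /endpoints/{assetDirectory}/content/{path} route and X-ApiKey header follow the Upload Asset File documentation, while the host, directory, file, and key are placeholders:

      using System;
      using System.IO;
      using System.Net.Http;
      using System.Threading.Tasks;

      class AssetUploadDemo
      {
          static async Task Main()
          {
              // Placeholders -- substitute your ProGet host, asset directory name, target path, and API key.
              var url = "https://proget.example.com/endpoints/installers/content/tools/setup.exe";

              using var http = new HttpClient();
              http.DefaultRequestHeaders.Add("X-ApiKey", "<api-key>");

              // Stream the raw file as the request body -- no multipart/form-data wrapper.
              await using var file = File.OpenRead("setup.exe");
              using var response = await http.PutAsync(url, new StreamContent(file));
              response.EnsureSuccessStatusCode();

              Console.WriteLine($"Uploaded: HTTP {(int)response.StatusCode}");
          }
      }

      The key point is that the raw file bytes go straight into the request body; wrapping them in a form upload is what produces the broken download.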

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Questions regarding ProGet Usage

      Hi @arunkrishnasaamy-balasundaram_4161,

      I'll do my best to answer these!

      1. You can configure the FullMavenConnectorIndex job to run on a routine basis under Admin > Scheduled tasks; it's not enabled by default because the MavenCentral Index is very large

      2. I'm not sure what "Context path" means?

      3. I'm not sure what you mean by "Group" repo?

      4. The way to handle this in ProGet is by using Policies & Rules to define noncompliant artifacts; certain versions of log4j would be considered noncompliant because it has that severe vulnerability

      5. We do not recommend 2000 feeds in any scenario; I wonder if there's a disconnect between what a "Feed" is and what you're looking for. A feed is a place where you store all of the artifacts for a division/group of projects. The volume isn't a problem, but even a massive organization should have dozens of feeds at most

      6. This should show on the "History" page

      7. There is the "Packags" page at the top that can do some cross-feed searching

      8. Yes; see https://docs.inedo.com/docs/proget/administration/retention-rules

      9. I'm not sure what a support zip file is.

      10. In general you would follow the migration guides we've published; however, your existing artifact server may not be configured to allow importing. If you're running into issues, best to open a new thread on the forums and we can review/investigate

      11. Yes; see https://docs.inedo.com/docs/proget/replication-feed-mirroring/proget-howto-utilize-feed-replication-for-disaster-recovery

      12. Yes; see https://docs.inedo.com/docs/installation/high-availability-load-balancing/high-availability-load-balancing

      Hope this helps point you in the right direction,

      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: ProGet v24 fails to initialize/upgrade the SQL Server database

      Hi @husterk_2844 ,

      I'm afraid we're at a loss here; no one else has reported any kind of errors like this, and I can't imagine what would even cause such a problem.

      I suspect there is something off about your Docker Compose file? That seems to be the only thing different than the basic setup.

      I would just try to re-follow the basic instructions we posted:
      https://docs.inedo.com/docs/installation/linux/docker-guide

      That's what we use to test, and lots of users install and upgrade without a problem.

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: ProGet successfully started Event

      Hi @forbzie22_0253,

      There's no Windows event logged, but once the /health page is reachable, the application is ready. If you're using SQL Server and IIS on the same box, then both of those must first load before ProGet can start.
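
      If you need an automated "ready" signal, a startup script can just poll that endpoint until it answers; here's a rough sketch (the hostname is a placeholder):

      using System;
      using System.Net.Http;
      using System.Threading.Tasks;

      class WaitForProGet
      {
          static async Task Main()
          {
              // Placeholder hostname -- point this at your ProGet instance.
              var healthUrl = "https://proget.example.com/health";

              using var http = new HttpClient { Timeout = TimeSpan.FromSeconds(5) };

              while (true)
              {
                  try
                  {
                      using var response = await http.GetAsync(healthUrl);
                      if (response.IsSuccessStatusCode)
                      {
                          Console.WriteLine("ProGet is ready.");
                          break;
                      }
                  }
                  catch (HttpRequestException)
                  {
                      // Not reachable yet -- IIS and/or SQL Server may still be starting.
                  }
                  catch (TaskCanceledException)
                  {
                      // Request timed out; keep waiting.
                  }

                  await Task.Delay(TimeSpan.FromSeconds(5));
              }
          }
      }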

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: License Usage Overview - Non-compliant Licenses in Use

      @v-makkenze_6348 whoops, good catch - yes thank you :)

      posted in Support
      stevedennis
      stevedennis
    • RE: License Usage Overview - Non-compliant Licenses in Use

      Thanks so much Valentijn!

      Looks like this was a display bug, and the code on the Licenses Overview was looking at UsedByPackage_Count instead of UsedByBuilds_Count. Easy fix, which will ship via PG-2774 in the next maintenance release:

      (screenshot of the fix)

      As an FYI, the package with GPL-2.0 is node-forge@1.3.1 in the VicreaNpmJs feed. Looking closer, that package is dual-licensed as BSD-3, so it's not really a problem.

      That said, the Licenses Overview page predates Policies, and I don't think the "License Usage Issues" section makes a lot of sense anymore. The old model was much simpler, with a basic allow/block rule; Policies are quite a bit more complicated.

      We're very open to ideas on what to do in its place, or if you have any suggestions on what could be improved in general in the SCA UI. It's very easy for us to "see" what you're talking about, since we have the backup now :)

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Max file upload

      Hi @russell_8876,

      ProGet itself does not have an upload limitation, so it's likely something else like your reverse proxy server, etc. Typically such large requests are blocked/prevented by middleware.

      That said, HTTP is not a reliable protocol and should never be used for such large requests. You will run into a lot of problems trying to upload 24GB files in a single request. You'll need a new approach.

      The easiest solution is to use drop paths, and then a file transfer protocol that is designed for large/reliable file transfers (most are).

      Another solution is to use "upload chunking", which only asset directories support; pgutil should handle the chunking/uploading for you. If you want to use packages, you can then import that uploaded asset into a universal package feed:
      https://docs.inedo.com/docs/proget/upack/proget-api-universalfeed/proget-api-universalfeed-import

      -- Dean

      posted in Support
      stevedennis
      stevedennis
    • RE: License Usage Overview - Non-compliant Licenses in Use

      Hi @v-makkenze_6348,

      In theory, you should be able to find the noncompliant build on the Projects > Builds page, then narrow it down from there. But if you have a lot that are noncompliant, this may be difficult.

      You've sent us your database in the past; would you mind uploading a BAK again? We can take a look and improve the UX so this will be discoverable. You can use an old link that we sent you a while ago, or fill out a support ticket and we'll get you a new link.

      Let us know if you upload it - and we'll take a look!

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: ProGet - Exception: An error occurred in the web application: Nullable object must have a value.

      Hi @jw,

      I'm afraid I'm at a loss then...

      It seems to be related to one of the following settings / issues:

      • Admin Banner Message that was configured on Admin section
      • When logged in as Admin...
        • New ProGet Version Detected
        • License Violation (i.e. using ProGet Free connectors)
        • License Problem/Expiry (your license key)
        • Extension Load Error
      • NuGet v2 queries

      Based on the stack trace, I can't pinpoint which one it is, but I'll share the code - and maybe you'll see something.

      I would first try to pinpoint which it is by trying as Admin vs Non-Admin, and then see which of those issues it could be.

      private static List<Notification> GetNotificationsInternal(AhHttpContext context)
      {
          var notifications = new List<Notification>();
      
          if(!string.IsNullOrWhiteSpace(ProGetConfig.Web.AdminBannerMessage) && (ProGetConfig.Web.AdminBannerExpiry == null || DateTime.UtcNow < ProGetConfig.Web.AdminBannerExpiry))
          {
              notifications.Add(new Notification(
                  NotificationType.warning,
                  ProGetConfig.Web.AdminBannerMessage
              ));
          }
      
          if (WebUserContext.IsAuthorizedForTask(ProGetSecuredTask.Admin_ConfigureProGet))
          {
              if (ShowUpdateNotification(context))
              {
                  notifications.Add(new Notification(
                      NotificationType.update,
                      InfoBlock.Success(
                          new A(Localization.Global.UpdatesAvailable) { Href = UpdatesOverviewPage.BuildUrl(returnUrl: AhHttpContext.Current.Request.Url.PathAndQuery) }
                      )
                  ));
              }
      
              if (ShowLicenseViolationNotification(context, out var violationUrl))
              {
                  notifications.Add(new Notification(NotificationType.error,
                      new A(Localization.Global.LicenseViolation)
                      { Href = violationUrl }
                  ));
              }
      
              var expiresDate = Licensing.LicensingInformation.Current.LicenseKey?.ExpiresDate;
              if (expiresDate != null && expiresDate <= DateTime.Now)
              {
                  notifications.Add(new Notification(NotificationType.error,
                      new A(Localization.Global.KeyExpired)
                      { Href = LicensingOverviewPage.BuildUrl() }
                  ));
              }
              else if (Licensing.LicensingInformation.Current?.LicenseKey?.LicenseType == ProGetLicenseType.Trial)
              {
                  var days = (int)expiresDate.Value.Subtract(DateTime.Now).TotalDays;
                  notifications.Add(new Notification(NotificationType.warning,
                          new A(Localization.Global.TrialWillExpire(ProGetConfig.Licensing.EnterpriseTrial ? "Enterprise" : "Basic", days))
                          { Href = LicensingOverviewPage.BuildUrl() }
                      ));
              }                    
              else if (expiresDate != null && expiresDate.Value.Subtract(DateTime.Now).TotalDays <= 45)
              {
                  var days = (int)expiresDate.Value.Subtract(DateTime.Now).TotalDays;
                  notifications.Add(new Notification(NotificationType.warning,
                      new A(Localization.Global.KeyWillExpire(days))
                      { Href = LicensingOverviewPage.BuildUrl() }
                  ));
              }
              
      
              var extensions = ExtensionsManager.GetExtensions();
      
              if (!extensions.Any() || extensions.All(e => !e.LoadResult.Loaded))
              {
                  notifications.Add(new Notification(
                      NotificationType.error,
                      InfoBlock.Error(
                          new A(Localization.Global.ExtensionLoadError) { Href = Pages.Administration.Extensions.ExtensionsOverviewPage.BuildUrl() }
                      )
                  ));
              }
      
              if (WUtil.ShowDockerRestartMessage)
              {
                  notifications.Add(
                      new Notification(
                          NotificationType.warning,
                          InfoBlock.Warning(Localization.Global.ContainerRestartNeeded)
                      )
                  );
              }
          }
         
          var v2Notifications = ShowV2DeprecatedQueriesUsedWarning(context);
          if (v2Notifications.Any())
          {
              notifications.AddRange(v2Notifications.Select(f => new Notification(
                  NotificationType.warning,
                  InfoBlock.Warning(new A(ManageFeedPropertiesPage.BuildUrl(f.Feed_Id), $"{f.Feed_Name} is using deprecated ODATA (V2) Queries."))
              )));
          }
      
          return notifications;
      }
      
      posted in Support
      stevedennis
      stevedennis
    • RE: ProGet - Exception: An error occurred in the web application: Nullable object must have a value.

      Hi @jw ,

      Can you try restarting the web application (Admin > Manage Service)? Hopefully it will be resolved after that.

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: ProGet v24 fails to initialize/upgrade the SQL Server database

      Hi @husterk_2844 ,

      Those are some strange errors and I haven't seen them before. It seems that something is wrong with SQL Server.

      What's interesting/notable here is that SQL Server is saying Incorrect syntax near 'GO' on the 20 CREATE TYPES.sql script. That hasn't changed in 10+ years... and it's also unlikely that the latest SQL Server 2022 is failing to parse/execute that script, but I can't think of anything else.

      The error occurring on 10 SET DATABASE PROPERTIES.sql is also peculiar; that script has a TRY/CATCH, so it's not considered a failure -- but the expected error is definitely not "ALTER DATABASE statement not allowed within multi-statement transaction."

      From here, I would "play around" with different versions of SQL Server and ProGet. We've never seen anything like this, so don't really know where to start.

      As for ProGet failing to start... a database failure would definitely yield that behavior, so that's not really an issue. The question really is why these database errors are occurring.

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: How to use buildmaster to handle Vue project

      Hi @506576828_9736 ,

      We don't have any templates or wizards for Vue projects, but you can follow the guidance on Creating a Build Script from Scratch, which would involve the following steps:

      1. Get Source
      2. Compile Project (I think this involves running npm?)
      3. Record Dependencies [optional]
      4. Run Tests [optional]
      5. Capture Artifacts

      Once you capture the artifact, you can use or customize the Deploy Artifacts to File Share Script Template

      Please share your experience, as it'll help future users searching for this as well :)

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Using multi-level feeds with passthrough is failing

      Thanks @johnsen_7555! If I can offer some advice....

      The workflow you're creating is a bit complicated, and adding in the automation component is "yet another product/process" to own/maintain. On our end, we get support inquiries from confused new administrators who notice "undocumented" behavior (i.e. not on docs.inedo.com) in ProGet.

      If you're not "worried" about malicious packages, then the main risks you are mitigating are:

      • legal liability with using wrong licenses
      • vulnerabilities in your own software via OSS packages
      • developer time in fixing the aforementioned problems

      Both licenses and vulnerabilities are only a problem if they go to production, and keep in mind that vulnerabilities need to be monitored even after a package is in use, since they are often discovered long after the package has made it into your production software.

      How about something like this:

      • block noncompliant packages
      • set A/GPL licenses to be noncompliant
      • auto-assess severe vulnerabilities, and set those to be noncompliant
      • set unknown licenses and unassessed vulnerabilities to warn
      • use pgutil in your CI/CD pipeline to prevent unaddressed warn from going to production
      • routinely monitor warn packages as you have time

      Note that, in a future version of ProGet, we intend to add more intelligence to package analysis for OSS packages. For example, we would like to say "this nunnpy package has 1 version, is recently published, has no GitHub repo, etc., and therefore is noncompliant".

      posted in Support
      stevedennis
      stevedennis
    • RE: Using multi-level feeds with passthrough is failing

      Hi @johnsen_7555,

      I'm not really sure I totally understand the automated workflow you want to create; you mentioned earlier having an approval process?

      Are there any gaps with the workflow I mentioned? Basically two feeds (approved, unapproved), where you use package promotion as the approval action.

      We don't recommend using webhooks to automate ProGet itself. This can create some loops that will cause headaches.

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Using multi-level feeds with passthrough is failing

      Hi @johnsen_7555 ,

      Ah ha, thanks for clarifying that!

      This is the expected behavior, and the reason is a bit complex.

      Unlike most package repositories, the PyPI Repository API (which a ProGet feed implements) does not provide any licensing information about packages. It's just a very basic listing of names and versions, which means that there is no license information (or description, author, etc). All of that is embedded in the package files.

      However, pypi.org has a special API that ProGet queries to provide more information about a package hosted on pypi.org. This way, description and license information can be displayed on remote packages. But this API is only for pypi.org, and the pip client doesn't use it.

      When you connect to another feed in ProGet, the regular API is used. And since the PyPi Repository API doesn't provide package metadata, this information isn't available. It's on our long-term roadmap to use a special API / method for ProGet->ProGet connections, but that's a ways off and requires a lot of internal refactoring.
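
      To illustrate the difference, here's a rough sketch against the public pypi.org endpoints (the package name is just an example): the Repository ("Simple") API only lists files, while pypi.org's JSON API exposes metadata like the license.

      using System;
      using System.Net.Http;
      using System.Text.Json;
      using System.Threading.Tasks;

      class PyPiMetadataDemo
      {
          static async Task Main()
          {
              using var http = new HttpClient();
              http.DefaultRequestHeaders.UserAgent.ParseAdd("metadata-demo/1.0");

              // 1) The Repository ("Simple") API -- what pip and feed-to-feed connections use.
              //    It only lists files/versions; no license, description, or author.
              var simple = await http.GetStringAsync("https://pypi.org/simple/requests/");
              Console.WriteLine($"Simple API returned {simple.Length} bytes of file links and no metadata.");

              // 2) pypi.org's JSON API -- the pypi.org-only API that exposes package metadata.
              var json = await http.GetStringAsync("https://pypi.org/pypi/requests/json");
              using var doc = JsonDocument.Parse(json);
              var info = doc.RootElement.GetProperty("info");
              Console.WriteLine($"License: {info.GetProperty("license").GetString()}");
              Console.WriteLine($"Summary: {info.GetProperty("summary").GetString()}");
          }
      }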

      That said, the workflow we support to accomplish what you want is as follows:
      https://blog.inedo.com/python/pypi-approval-workflow/

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Using multi-level feeds with passthrough is failing

      Thanks for clarifying @johnsen_7555.

      I'm struggling a bit to see what kind of configuration might cause this issue or how to reproduce it. Is your python-accessible feed connected directly to PyPi.org?

      If you go to re-analyze the package, you should get a really long set of debug logs (no need to send them). But after you do that, can you try the download again?

      posted in Support
      stevedennis
      stevedennis
    • RE: Using multi-level feeds with passthrough is failing

      Hi @johnsen_7555 ,

      Sounds like you're building a sort of Python Package Approval Workflow, which is great to see.

      If the user doesn't have permission to download the file, I would expect a 401 (if anonymous) or 403 if authenticated.

      A 400 error is a bad request. It could be coming from ProGet, as ProGet will occasionally throw that message when there is unexpected input. But it could also be coming from an intermediate server that's processing the request before forwarding to ProGet.

      In this case, I believe pip is simply just performing a GET on the URL in the error message:

      .../download/numpy/2.0.0/numpy-2.0.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=6d7696c615765091cc5093f76fd1fa069870304beaccfd58b5dcc69e55ef49c1

      I'm not 100% sure that's what pip is doing, but why not try a curl -v against that URL and see if you also get a 400?

      If so, then you should get an error in the message body from curl. ProGet will write this out to the response stream.

      If not, then you'll need to capture the traffic and see what the difference is. Maybe it's a header that's different? I'm not sure what would cause ProGet to yield a 400.
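
      If it's easier to script than to run curl, a quick check like this prints the status code plus whatever error body ProGet writes to the response (the URL is a placeholder for the one in the error message):

      using System;
      using System.Net.Http;
      using System.Threading.Tasks;

      class DownloadCheck
      {
          static async Task Main()
          {
              // Placeholder -- paste the full .whl URL from the pip error message here.
              var url = "https://proget.example.com/pypi/<feed>/download/numpy/2.0.0/<file>.whl";

              using var http = new HttpClient();
              using var response = await http.GetAsync(url);

              Console.WriteLine($"Status: {(int)response.StatusCode} {response.ReasonPhrase}");

              // A 400 from ProGet usually includes an explanation in the response body.
              Console.WriteLine(await response.Content.ReadAsStringAsync());
          }
      }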

      Let us know what you find,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: ProGet Connector Timeout

      Hi @scott-wright_8356 ,

      Once connector caching is enabled, the error pattern is not used, so we only have this warning. I added a small change via PG-2726 which will add the connector name. This will appear in the next maintenance release (2024.9), scheduled for this week.

      Removing connector caching should reveal the connector name, so maybe that helps you identify it until then

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: ProGet SCA: SBOM export does not work + UI issue

      @jw thanks for clarifying!

      The pipelines in ProGet are not really meant to track status. The main reason for the build stages is to control automatic archival, issue creation, and notifications. For example, your stable releases might stay in a "Released" stage indefinitely.

      We also had (in the preview feature) three build statuses: Active, Archived, Released. We don't use Released currently, but it's definitely something we may bring back. You don't want to delete a Released build, but you would probably want to delete an Archived build.

      Anyway, I'd check out the pgutil builds promote command - that way you can keep your "archival rules" in ProGet.

      posted in Support
      stevedennis
      stevedennis
    • RE: ProGet SCA: SBOM export does not work + UI issue

      Hi @jw,

      We'd love to learn more - why not?

      We envisioned that there would be lots and lots of builds in the Build stage (i.e. created by a CI server), and the ones released might go to a stage like Production.

      Thanks

      posted in Support
      stevedennis
      stevedennis
    • RE: ProGet SCA: SBOM export does not work + UI issue

      @jw this was not addressed; the "build pipelines" are fairly primitive in ProGet 2024, and we plan to review them as a whole as we get more feedback from users

      If you click on "all builds" that might be the view you are expecting

      posted in Support
      stevedennis
      stevedennis
    • RE: ProGet: UI 403 errors

      Hi @jw ,

      We'll address that via PG-2718 by displaying a message on the Bulk Edit/Promote Pages if the user lacks permission to delete or promote the selected packages.

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Support for Rust Cargo packages

      Hi @fabrice-mejean_6049 , @brett-polivka

      Thanks for the feedback!

      With ProGet 2023 out of the way (which made developing new feeds a lot simpler), and the addition of the Cargo Registry Web API, this is something we're much more open to implementing.

      We want a user partner - someone who is already using Rust/Cargo and would be able to work with us on testing/developing this. Currently there hasn't been much market demand from prospects or other users... and for your team, it seems like a "nice to have" more than anything.

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Error for Visual Studio Extensions feed

      Great!

      I don't believe that there is any public reporting of VSIX package vulnerabilities, so this is not data that would be available in ProGet either

      posted in Support
      stevedennis
      stevedennis
    • RE: Error for Visual Studio Extensions feed

      Hi @arkady-karasin_6391 ,

      Thanks for sharing the error; basically it means that you're connecting to an invalid feed/URL. You can only connect to a VSIX feed, which is an ATOM-based format.

      Note: You cannot mirror the "official" Visual Studio Gallery URL, as that is technically not a VSIX gallery. It uses a different and undocumented format, and Visual Studio will not recognize that format for anything except the official gallery.

      Connectors for VSIX feeds aren't very useful; really they're only good for connecting to other ProGet instances.

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Create a grouped "new" Debian feed using the API

      Hi @Scati,

      It looks like there isn't a way to do this currently; we'll fix this via PG-2716 in an upcoming maintenance release. Maybe this week's, but more likely two weeks from now.

      In ProGet 2024, debian should create a "Debian" feed, and debianlegacy will create a "Debian (Legacy)" feed. We'll also add a feedGroup property.

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis
    • RE: Otter "reverse" way working

      Hi @daniels,

      This is technically possible, but it's not something that we support out-of-the-box I'm afraid.

      The closest way to accomplish something like this would be:

      • Use the Otter API to create a server entry with the appropriate roles (similar to what you'd do in the UI)
      • Install/run the agent on the computer which will connect to Otter
      • Use the Otter API to trigger a job/remediation
      • Optionally, remove the server from Otter and Stop/uninstall the Agent

      So it's not easy from the user perspective. However, if Otter and this approach look like they will work for you, then it's something we can definitely explore together as a user/customer, and build out a use case / case study / etc. It'd be best to start a conversation with someone on our customer/sales team about that.

      Otherwise, there really hasn't been enough demand for this particular use case, and it's hard enough to market Otter's other use cases, let alone develop and build new ones ;)

      FYI - note that you can also run the Inedo Agent in outgoing mode:
      https://docs.inedo.com/docs/inedoagent-overview#communication-modes

      Thanks,
      Steve

      posted in Support
      stevedennis
      stevedennis