    Inedo Community Forums

    Posts made by atripp

    • RE: Incorrect published date handling breaks min-release-age for npm feeds

      Hi @aleksander-szczepanek_3253 ,

      If you navigate to Admin > Advanced Settings and check "Use Connector Publish Date", then this will behave as you expect. Note that you will need to delete already-cached packages.

      This will be the default behavior in ProGet 2026+.

      Cheers,
      Alana

      posted in Support
      atripp
      atripp
    • RE: Transfer License: Active On Two Servers Temporarily

      Hi @denis-krienbuehl_4885 ,

      Thanks for checking; for a short-term situation like this, no problem!

      Cheers,
      Alana

      posted in Support
    • RE: Supported Database for ProGet HA Installations

      Hi @EnterpriseVirtualization_2441 ,

      We do not recommend using SQL Server Availability Groups.

      For a product like ProGet, a single database node is all that's required -- and it's strongly recommended.

      There is no practical benefit to a clustered database here - on the contrary, it makes the product slower, less stable, and more costly/complex to maintain. As such, InedoDB does not support clustering.

      Cheers,
      Alana

      posted in Support
    • RE: Support for kubernetes-based deployment of ProGet and InedoDB?

      Hi @jeff-williams_1864 ,

      ProGet for Linux (Docker) is fully supported. You deploy it how you'd like, and many customers use container orchestration platforms like Kubernetes with no problem.

      However, we only provide step-by-step instructions for Docker. This is intentional, as these platforms are quite complex and require a lot of skills to configure, maintain, and troubleshoot.

      While we try to help support "platform issues" on Windows (i.e. everything from permissions to Domain configuration), that's a lot more straightforward for us to support -- and Microsoft can pick up the slack (e.g. a failed Windows update, etc).

      So long story short, if you are comfortable with Kubernetes/OpenShift, feel free to use it. But otherwise, we don't want ProGet to be our users' "first Kubernetes" experience :)

      Thanks,
      Alana

      posted in Support
    • RE: https://docs.inedo.com/docs/proget/api/pgutil#sources ~/.config/pgutil/ pgutil.config correction

      Thanks for pointing that out @rcpa0 ! I've just updated the docs now.

      posted in Support
    • RE: Supported Database for ProGet HA Installations

      Hi @jeff-williams_1864 ,

      You mentioned that you're "using the embedded database at the moment", which I take to mean that you're not using a separate SQL Server container image.

      In that case, the only options for a clustered installation are InedoDB (recommended) or an external PostgreSQL (not recommended).

      If you were using SQL Server, then SQL Server would be supported for a clustered instance as well. However, we are moving away from SQL Server, so we definitely wouldn't recommend it on a new installation.

      Thanks,
      Alana

      posted in Support
    • RE: Proget is unable to download Maven packages that use a nonstandard versioning scheme

      Hi @devops-user @joshua-mitchell_8090 ,

      Thank you so much for testing! We'll merge this in via PG-3251 in tomorrow's maintenance release.

      As for the other error, it's technically unrelated -- but that package has such a long "compliance analysis report" that it's getting truncated in the database cache. PostgreSQL complains about the truncation, while SQL Server does it silently. Anyway, we'll fix that via PG-3250, perhaps in tomorrow's release as well.

      Cheers,
      Alana

      posted in Support
    • RE: Docker _catalog and tags calls do not respect tokens

      Hi @Stephen-Schaff ,

      The Docker API does not use API keys but a ticket-based system (i.e. docker login). Here is how to use it:
      https://docs.inedo.com/docs/proget/docker/semantic-versioning#example-powershell-script-to-authenticate-to-docker
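      For reference, the ticket flow the linked script implements is the standard Docker Registry v2 token handshake. Here's a rough sketch with placeholder hosts -- none of these URLs are ProGet-specific; the real realm/service values come from the registry's own WWW-Authenticate response:

```shell
# 1. An unauthenticated GET /v2/ returns 401 with a WWW-Authenticate header
#    like the placeholder below; parse the token realm out of it.
hdr='Bearer realm="https://auth.example.com/token",service="registry.example.com"'
realm=$(printf '%s' "$hdr" | sed 's/.*realm="\([^"]*\)".*/\1/')
echo "$realm"
# 2. (needs a live registry, so shown commented out) request a token from the
#    realm, then use it as a bearer token on catalog/tags calls:
#   TOKEN=$(curl -su user:pass "$realm?service=registry.example.com&scope=registry:catalog:*" | jq -r .token)
#   curl -H "Authorization: Bearer $TOKEN" https://registry.example.com/v2/_catalog
```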

      We added limited support via PG-3206 in ProGet 2025.20, though it was only intended to address self-connectors to Docker registries. I don't know how well it will work here.

      Thanks,
      Alana

      posted in Support
    • RE: Composer feed: metapackage is not saved as a local package

      Hi @vdubrovskyi_1854 ,

      Unfortunately this is just how composer works; it never requests the metapackage from the server (i.e. ProGet) nor does it upload the composer.lock file to ProGet.

      There is no way for ProGet to "guess" which metapackages you may want, and ProGet obviously does not automatically download/install every metapackage from the upstream repository. That's not behavior anyone would want, and we will not add it to ProGet.

      You have two options:

      1. Modify the behavior of composer to request these packages from ProGet,
      2. Write a script to parse your composer.lock and then download and/or promote those packages within ProGet
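      For option 2, a minimal starting point (assuming jq is installed and a standard composer.lock layout; the pgutil flags in the comment are assumptions -- check pgutil's help output):

```shell
# List name@version pairs from composer.lock's "packages" array.
jq -r '.packages[] | "\(.name)@\(.version)"' composer.lock
# Each line could then drive a download/promotion, e.g. (hypothetical flags):
#   pgutil packages download --feed=internal-composer --package=NAME --version=VERSION
```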

      Hope that helps,

      Alana

      posted in Support
    • RE: Composer feed: metapackage is not saved as a local package

      Hi @vdubrovskyi_1854 ,

      I'm not an expert on how Composer handles packages, but so far as I can tell the behavior you’re seeing is expected and is how metapackage types are handled.

      A metapackage does not contain any files and is not installed into the vendor/ directory. It exists only to define dependencies on other packages. There are no contents in the package, and thus nothing for Composer to fetch.

      Because of this:

      • It will appear in composer.lock as part of dependency resolution
      • It will not create a directory under vendor/
      • The content itself is not fetched (downloaded) by Composer
      • Only the Composer API is queried

      ProGet can only cache a package when a download/fetch occurs. Since metapackages are not fetched, there is nothing to cache.

      When you download manually, you are deviating from Composer's normal install behavior for metapackages -- which is why the package then appears.

      In summary, this behavior is expected and not an error in ProGet. Unfortunately there's no way for ProGet to cache these packages, since Composer never downloads them.

      Hope that helps,

      Alana

      posted in Support
    • RE: [Buildmaster] Add queryable custom properties on a deployment level

      Hi @Anthony,

      I'm afraid not; the deploymentinfo endpoint is part of the BuildMaster API, and it's effectively reading data from the database that you'd otherwise see in the UI. It's fairly "disconnected" from runtime execution.

      The only persistent (i.e. outside of runtime) variables are going to be configuration (i.e. Build-scoped) variables. I suppose one thing you could do is define multiple build variables (e.g. $MyTarget1=value, $MyTarget2=value2).

      You could also store a map variable on the build, like %MyMap = %(MyTarget1: value, MyTarget2: value2) -- although that might involve a bit of awkward OtterScript to get working.

      It's not a use case we designed for.

      Cheers,
      Alana

      posted in Support
    • RE: Proget is unable to download Maven packages that use a nonstandard versioning scheme

      Hi @devops-user ,

      No problem, I just pushed Release 2025.25-rc.1.

      It's simply the 2025.24 release with the Maven patch added in. You can install it like this:
      https://docs.inedo.com/docs/installation/windows/inedo-hub/howto-install-prerelease-product-versions

      Thanks,
      Alana

      posted in Support
    • RE: proget.inedo.com DDOSed?

      Thanks for the heads up @felfert !

      Looks like we're back now; it seems there was some issue reporting the outage on our end :)

      Alana

      posted in Support
    • RE: [Buildmaster] Add queryable custom properties on a deployment level

      hi @Anthony ,

      This sounds like a great use case for variables; you can programmatically set them in OtterScript using an operation (check your /reference/operations page to learn the name -- I think it was renamed in modern versions of BuildMaster), and then query them with the variables endpoint: builds/«application-name»/«release-number»/«build-number»

      Here's a link to the documentation:
      https://docs.inedo.com/docs/buildmaster/reference/api/variables

      Cheers,
      Alana

      posted in Support
    • RE: Proget is unable to download Maven packages that use a nonstandard versioning scheme

      Hi @devops-user ,

      Can you try out the container image above? We'd like to get confirmation that it's working and we can then merge it to ProGet 2025 (we were otherwise planning ProGet 2026).

      Thanks
      Alana

      posted in Support
    • RE: Proget is unable to download Maven packages that use a nonstandard versioning scheme

      Thanks @joshua-mitchell_8090 , we'll consider merging it in then!

      As for "how the dependencies are identified within the project build vulnerabilities", I suppose so -- the IncrementalVersion2 will allow for proper vulnerability association with packages that use "incorrect" versions (like 1.2.3.4). Jackson Databind is the one we kept coming across.

      Note you can request another trial key from my.inedo.com to try it out :)

      posted in Support
    • RE: ProGet SBOM Scan Not Creating Vulnerability Issues for NPM Packages

      Hi @_moep_ ,

      So there are quite a few "moving pieces" here.

      Vulnerability -> Assessment -> Compliance -> Build Issue

      Vulnerabilities & Assessments

      First and foremost, when you navigate to qs@0.6.6 in the ProGet UI, you should see several vulnerabilities listed, such as PGV-2287703. So, the "identification" is there as a result of the offline version of that database being included with ProGet.

      But ProGet is all about reducing noise while helping elevate real risks -- and most vulnerabilities are theoretical, have no real-world exploits, would require a dedicated attacker, and would result in no real damage.

      A "Denial of Service from Prototype Pollution" is a great example of such a vulnerability. The risks and problems introduced by reactively upgrading every dependency far exceed any benefits -- moreover, it "de-sensitizes" everyone to real security risks. The idea of "when everything is severe, nothing is" is the same as "when everything is a priority, nothing is".

      That's where Assessment comes in. In ProGet 2025 and earlier, a vulnerability is generally assessed as Ignored, Warn, or Blocked. PGV-2287703 will be assessed as Warn by default.

      **NOTE: this will be changing in ProGet 2026.**

      Policies & Compliance

      Next, there's the question of Compliance; the vulnerability assessment (among other things, like license, deprecation status, etc.) determines whether a package is Compliant, Noncompliant, or Warn.

      Compliance rules are configured in policies. In ProGet 2025, by default, the "Warn" Assessment will not make a package Noncompliant. Just Warn.

      Builds & Issues

      A Build is considered Noncompliant if any of its packages are Noncompliant. A Noncompliant build should be blocked from deploying to production.

      This is where Issues come in: an issue may be created when a build is analyzed (try it out by clicking [analyze] in the UI) for a Noncompliant package. The purpose of these Issues is to effectively "override" the compliance status on a single package.

      They are not informational; if you want a list of packages, vulnerabilities, licenses, just use pgutil builds audit to get that listing.

      Long story short, I'd decide on a process you'd want to use before even considering web hooks for all this.

      Also note that this mostly requires a paid license, so you may not even have this functionality if you're on the free version.

      hope that helps,
      Alana

      posted in Support
    • RE: Support for Air-gapped environments

      Hi @steviecoaster ,

      Offline / air-gapped installation is common and a documented use-case:
      https://docs.inedo.com/docs/installation/windows/inedo-hub/offline

      As the article mentions, you can download the "offline installer", which is essentially a self-extracting zip file that runs a Custom Installer created using the Inedo Hub.

      That .exe file is not suitable for automation, so if automating upgrades/installations is a requirement, you'll need an alternative approach. That article outlines a few concepts, but ultimately it really depends how "air-gapped" we're talking here.

      If we're talking about a SCIF with "security-guard inspected installation media", then I don't think automation is really going to get you much ;)

      Thanks,
      Alana

      posted in Support
    • RE: Allow networkservice to use the DB in Proget

      Hi @reseau_6272 ,

      Just to confirm, you've switched the ProGet service from a domain account to use Network Service, and when starting the service you're getting some kind of permission error from SQL Server?

      The easiest solution is to simply switch to using a username/password instead of Windows Integrated Authentication and edit the connection string appropriately. Keep in mind that, eventually, you will need to move away from SQL Server and migrate to PostgreSQL, which will not have these issues.

      Otherwise, you will need to explicitly grant a login to the machine account. Network Service is represented in SQL Server as the machine account (e.g., DOMAIN\MACHINENAME$), and the identity needs to be explicitly created (CREATE LOGIN [MYDOMAIN\WEB01$] FROM WINDOWS;) before you can assign permissions.
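      Concretely, the setup looks something like this -- a hedged sketch, where the database name [ProGet] and the db_owner grant are assumptions; adjust to your environment and preferred least-privilege role:

```sql
-- Run on the SQL Server instance; WEB01 is a hypothetical ProGet server name.
CREATE LOGIN [MYDOMAIN\WEB01$] FROM WINDOWS;
GO
USE [ProGet];
CREATE USER [MYDOMAIN\WEB01$] FOR LOGIN [MYDOMAIN\WEB01$];
ALTER ROLE [db_owner] ADD MEMBER [MYDOMAIN\WEB01$];
GO
```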

      Thanks,
      Alana

      posted in Support
    • RE: [ProGet] Unexpected redirect when accessing Maven package with non-standard version starting with a character

      Hi @koksime-yap_5909,

      Good news, it's available now for testing! We're considering merging it into ProGet 2025, or maybe keeping it for ProGet 2026?

      Anyway, I posted a lot more detail now:
      https://forums.inedo.com/topic/5696/proget-is-unable-to-download-maven-packages-that-use-a-nonstandard-versioning-scheme/2

      Thanks,
      Alana

      FYI -- I locked this thread; if anyone has comments/questions on the change, I guess that post will be the "official" thread at this point :)

      posted in Support
    • RE: Symbol Server id issue

      Hi @it_9582 ,

      Thanks for checking that! Well, I'm not sure then :)

      From here, how about sending us the package? Then I can upload it and see about debugging in ProGet to find out where it's coming from.

      If you can open a ticket and reference QA-3010 somewhere, it'll link the issues right up on our dashboard. Then you can attach the file to that ticket.

      We'll respond on there, and eventually update this thread once we figure out the issue.

      Thanks,
      Alana

      posted in Support
    • RE: Proget is unable to download Maven packages that use a nonstandard versioning scheme

      Hi @joshua-mitchell_8090 ,

      Thanks for the inquiry! The changes are available in the inedo/proget:25.0.24-ci.4 container image, and we'd love to get a second set of eyes. Are you using Docker?

      They're relatively simple, but we just avoid changing stuff like this in maintenance releases... so it's currently slated for ProGet 2026.

      But it should be okay for a maintenance release. Please let us know -- we'll decide whether to release based on your and other users' feedback.

      Here's what we changed.

      First, we added a "sixth" component called IncrementalVersion2 that supports versions like 1.2.3.4-mybuild-678 (where 4 is the second incrementing version), so that vulnerability identification can work better. Our implementation is based on the Maven version specs, which, in retrospect, seem to be followed only by ProGet. Pretty low risk here.

      Second, we changed our "path parsing" logic, which identifies the groupId, artifactId, version, and artifactType from strings like /junit/junit/4.8.2/junit-4.8.2.jar or /mygroup/more-group/group-42/my-artifact/1.0-SNAPSHOT/maven-metadata.xml.

      It's a little hard to explain, so I'll just share the new and old logic:

      //OLD: if (urlPartsQ.TryPeek(out string? maybeVersion) && char.IsNumber(maybeVersion, 0))
      if (urlPartsQ.TryPeek(out string? maybeVersion) && (
          char.IsNumber(maybeVersion, 0)
          || maybeVersion.EndsWith("-SNAPSHOT", StringComparison.OrdinalIgnoreCase)
          || (this.FileName is not null && !this.FileName.Equals("maven-metadata.xml", StringComparison.OrdinalIgnoreCase))
          ))
      {
          this.Version = maybeVersion;
          urlPartsQ.Pop();
      }
      

      Long story short, this seems to work fine for v8.5.0 and shouldn't break unless someone is uploading improperly named artifact files (e.g. my-group/my-artifact/version-1000/maven-metadata.xml or e.g. my-photo/cool-snapshot/hello-kitty.jpg).

      Thanks,
      Alana

      posted in Support
    • RE: [ProGet] Unexpected redirect when accessing Maven package with non-standard version starting with a character

      Hi @koksime-yap_5909 ,

      Just as a quick update! Given that this is a more widespread problem, we've fixed the code and plan to release it in ProGet 2026 (or possibly sooner, if we can make it low-risk enough for a maintenance release).

      Thanks,
      Alana

      posted in Support
    • RE: Symbol Server id issue

      Hi @it_9582 ,

      Sorry -- it looks like we're dealing with a lot more code than I expected. I really don't know what to look at, and neither your code nor our code makes much sense to me (it's been many, many years since anyone edited it).

      I'm not sure if it's helpful, but I'll share the code of our class. If you spot anything simple to change, we can explore it. Otherwise, I think the only way to move forward would be for you to share some example NuGet packages with us that we can attach a debugger to.

      Here's the MicrosoftPdbFile class, which I combined into one giant string here:

      using System;
      using System.Collections;
      using System.Collections.Generic;
      using System.Collections.Immutable;
      using System.IO;
      using System.Linq;
      using System.Text;
      
      namespace Inedo.ProGet.Symbols;
      
      /// <summary>
      /// Provides access to the data contained in a Microsoft PDB file.
      /// </summary>
      public sealed class MicrosoftPdbFile : IDisposable, IPdbFile
      {
          private RootIndex root;
          private Dictionary<string, int> nameIndex;
          private bool leaveStreamOpen;
          private bool disposed;
      
          /// <summary>
          /// Initializes a new instance of the <see cref="MicrosoftPdbFile"/> class.
          /// </summary>
          /// <param name="stream">Stream which is backed by a PDB file.</param>
          /// <param name="leaveStreamOpen">Value indicating whether to leave the stream open after this instance is disposed.</param>
          public MicrosoftPdbFile(Stream stream, bool leaveStreamOpen)
          {
              if (stream == null)
                  throw new ArgumentNullException(nameof(stream));
      
              this.leaveStreamOpen = leaveStreamOpen;
              this.Initialize(stream);
          }
      
          /// <summary>
          /// Gets the PDB signature.
          /// </summary>
          public uint Signature { get; private set; }
          /// <summary>
          /// Gets the PDB age.
          /// </summary>
          public uint Age { get; private set; }
          /// <summary>
          /// Gets the PDB guid.
          /// </summary>
          public Guid Guid { get; private set; }
      
          ImmutableArray<byte> IPdbFile.Id => this.Guid.ToByteArray().ToImmutableArray();
          bool IPdbFile.IsPortable => false;
      
          /// <summary>
          /// Returns a stream backed by the data in a named PDB stream.
          /// </summary>
          /// <param name="streamName">Name of the PDB stream to open.</param>
          /// <returns>Stream backed by the specified named stream.</returns>
          public Stream OpenStream(string streamName)
          {
              if (streamName == null)
                  throw new ArgumentNullException(nameof(streamName));
      
              int? streamIndex = this.TryGetStream(streamName);
              if (streamIndex == null)
                  throw new InvalidOperationException($"Stream {streamName} was not found.");
      
              return this.root.OpenRead((int)streamIndex);
          }
          /// <summary>
          /// Returns an enumeration of all of the stream names in the PDB file.
          /// </summary>
          /// <returns>Enumeration of all stream names.</returns>
          public IEnumerable<string> EnumerateStreams() => this.nameIndex.Keys;
          /// <summary>
          /// Returns an enumeration of all of the source file names in the PDB file.
          /// </summary>
          /// <returns>Enumeration of all of the source file names.</returns>
          public IEnumerable<string> GetSourceFileNames()
          {
              var srcFileNames = this.EnumerateStreams()
                  .Where(s => s.StartsWith("/src/files/", StringComparison.OrdinalIgnoreCase))
                  .Select(s => s.Substring("/src/files/".Length))
                  .ToHashSet(StringComparer.OrdinalIgnoreCase);
      
              try
              {
                  using (var namesStream = this.OpenStream("/names"))
                  using (var namesReader = new BinaryReader(namesStream))
                  {
                      namesStream.Position = 8;
                      int length = namesReader.ReadInt32();
                      long endPos = length + 12;
      
                      while (namesStream.Position < endPos && namesStream.Position < namesStream.Length)
                      {
                          try
                          {
                              var name = ReadNullTerminatedString(namesReader);
                              if (name.Length > 0 && Path.IsPathRooted(name))
                                  srcFileNames.Add(name);
                          }
                          catch
                          {
                              // Can't read name
                          }
                      }
                  }
              }
              catch
              {
                  // Can't enumerate names stream
              }
      
              return srcFileNames;
          }
      
          /// <summary>
          /// Closes the PDB file.
          /// </summary>
          public void Close()
          {
              if (!this.disposed)
              {
                  this.root.Close(this.leaveStreamOpen);
                  this.disposed = true;
              }
          }
          void IDisposable.Dispose() => this.Close();
      
          private void Initialize(Stream stream)
          {
              var fileSignature = new byte[0x20];
              stream.Read(fileSignature, 0, fileSignature.Length);
      
              this.root = new RootIndex(stream);
      
              using (var sigStream = this.root.OpenRead(1))
              using (var reader = new BinaryReader(sigStream))
              {
                  uint version = reader.ReadUInt32();
                  this.Signature = reader.ReadUInt32();
                  this.Age = reader.ReadUInt32();
                  this.Guid = new Guid(reader.ReadBytes(16));
      
                  this.nameIndex = ReadNameIndex(reader);
              }
          }
          private int? TryGetStream(string name) => this.nameIndex.TryGetValue(name, out int index) ? (int?)index : null;
      
          private static Dictionary<string, int> ReadNameIndex(BinaryReader reader)
          {
              int stringOffset = reader.ReadInt32();
      
              var startOffset = reader.BaseStream.Position;
              reader.BaseStream.Seek(stringOffset, SeekOrigin.Current);
      
              int count = reader.ReadInt32();
              int hashTableSize = reader.ReadInt32();
      
              var present = new BitArray(reader.ReadBytes(reader.ReadInt32() * 4));
              var deleted = new BitArray(reader.ReadBytes(reader.ReadInt32() * 4));
              if (deleted.Cast<bool>().Any(b => b))
                  throw new InvalidDataException("PDB format not supported: deleted bits are not 0.");
      
              var nameIndex = new Dictionary<string, int>(hashTableSize + 100, StringComparer.OrdinalIgnoreCase);
      
              for (int i = 0; i < hashTableSize; i++)
              {
                  if (i < present.Length && present[i])
                  {
                      int ns = reader.ReadInt32();
                      int ni = reader.ReadInt32();
      
                      var pos = reader.BaseStream.Position;
                      reader.BaseStream.Position = startOffset + ns;
                      var name = ReadNullTerminatedString(reader);
                      reader.BaseStream.Position = pos;
      
                      nameIndex.Add(name, ni);
                  }
              }
      
              return nameIndex;
          }
          private static string ReadNullTerminatedString(BinaryReader reader)
          {
              var data = new List<byte>();
              var b = reader.ReadByte();
              while (b != 0)
              {
                  data.Add(b);
                  b = reader.ReadByte();
              }
      
              return Encoding.UTF8.GetString(data.ToArray());
          }
      
          private sealed class PagedFile : IDisposable
          {
              private LinkedList<CachedPage> pages = new LinkedList<CachedPage>();
              private Stream baseStream;
              private readonly object lockObject = new object();
              private BitArray freePages;
              private uint pageSize;
              private uint pageCount;
              private bool disposed;
      
              public PagedFile(Stream baseStream, uint pageSize, uint pageCount)
              {
                  this.baseStream = baseStream;
                  this.pageSize = pageSize;
                  this.pageCount = pageCount;
                  this.CacheSize = 1000;
              }
      
              public int CacheSize { get; }
              public uint PageSize => this.pageSize;
              public uint PageCount => this.pageCount;
      
              public void InitializeFreePageList(byte[] data)
              {
                  this.freePages = new BitArray(data);
              }
              public byte[] GetFreePageList()
              {
                  var data = new byte[this.freePages.Count / 8];
                  for (int i = 0; i < data.Length; i++)
                  {
                      for (int j = 0; j < 8; j++)
                      {
                          if (this.freePages[(i * 8) + j])
                              data[i] |= (byte)(1 << j);
                      }
                  }
      
                  return data;
              }
              public byte[] GetPage(uint pageIndex)
              {
                  if (this.disposed)
                      throw new ObjectDisposedException(nameof(PagedFile));
                  if (pageIndex >= this.pageCount)
                      throw new ArgumentOutOfRangeException();
      
                  lock (this.lockObject)
                  {
                      var page = this.pages.FirstOrDefault(p => p.PageIndex == pageIndex);
                      if (page != null)
                      {
                          this.pages.Remove(page);
                      }
                      else
                      {
                          var buffer = new byte[this.pageSize];
                          this.baseStream.Position = this.pageSize * pageIndex;
                          this.baseStream.Read(buffer, 0, buffer.Length);
                          page = new CachedPage
                          {
                              PageIndex = pageIndex,
                              PageData = buffer
                          };
                      }
      
                      while (this.pages.Count >= this.CacheSize)
                      {
                          this.pages.RemoveLast();
                      }
      
                      this.pages.AddFirst(page);
      
                      return page.PageData;
                  }
              }
              public void Dispose()
              {
                  this.baseStream.Dispose();
                  this.pages = null;
                  this.disposed = true;
              }
      
              private sealed class CachedPage : IEquatable<CachedPage>
              {
                  public uint PageIndex;
                  public byte[] PageData;
      
                  public bool Equals(CachedPage other) => this.PageIndex == other.PageIndex && this.PageData == other.PageData;
                  public override bool Equals(object obj) => obj is CachedPage p ? this.Equals(p) : false;
                  public override int GetHashCode() => this.PageIndex.GetHashCode();
              }
          }
          private sealed class PdbStream : Stream
          {
              private RootIndex root;
              private StreamInfo streamInfo;
              private uint position;
      
              public PdbStream(RootIndex root, StreamInfo streamInfo)
              {
                  this.root = root;
                  this.streamInfo = streamInfo;
              }
      
              public override bool CanRead => true;
              public override bool CanSeek => true;
              public override bool CanWrite => false;
              public override long Length => this.streamInfo.Length;
              public override long Position
              {
                  get => this.position;
                  set => this.position = (uint)value;
              }
      
              public override void Flush()
              {
              }
              public override int Read(byte[] buffer, int offset, int count)
              {
                  if (buffer == null)
                      throw new ArgumentNullException(nameof(buffer));
      
                  int bytesRemaining = Math.Min(count, (int)(this.Length - this.position));
                  int bytesRead = 0;
      
                  while (bytesRemaining > 0)
                  {
                      uint currentPage = this.position / this.root.Pages.PageSize;
                      uint currentPageOffset = this.position % this.root.Pages.PageSize;
      
                      var page = this.root.Pages.GetPage(this.streamInfo.Pages[currentPage]);
      
                      int bytesToCopy = Math.Min(bytesRemaining, (int)(this.root.Pages.PageSize - currentPageOffset));
      
                      Array.Copy(page, currentPageOffset, buffer, offset + bytesRead, bytesToCopy);
                      bytesRemaining -= bytesToCopy;
                      this.position += (uint)bytesToCopy;
                      bytesRead += bytesToCopy;
                  }
      
                  return bytesRead;
              }
              public override int ReadByte()
              {
                  if (this.position >= this.Length)
                      return -1;
      
                  uint currentPage = this.position / this.root.Pages.PageSize;
                  uint currentPageOffset = this.position % this.root.Pages.PageSize;
      
                  var page = this.root.Pages.GetPage(this.streamInfo.Pages[currentPage]);
                  this.position++;
      
                  return page[currentPageOffset];
              }
              public override long Seek(long offset, SeekOrigin origin)
              {
                  switch (origin)
                  {
                      case SeekOrigin.Begin:
                          this.position = (uint)offset;
                          break;
      
                      case SeekOrigin.Current:
                          this.position = (uint)((long)this.position + offset);
                          break;
      
                      case SeekOrigin.End:
                          this.position = (uint)(this.Length + offset);
                          break;
                  }
      
                  return this.position;
              }
              public override void SetLength(long value) => throw new NotSupportedException();
              public override void Write(byte[] buffer, int offset, int count) => throw new NotSupportedException();
              public override void WriteByte(byte value) => throw new NotSupportedException();
          }
          private sealed class RootIndex
          {
              private BinaryReader reader;
              private List<StreamInfo> streams = new List<StreamInfo>();
              private StreamInfo rootStreamInfo;
              private StreamInfo rootPageListStreamInfo;
              private uint freePageMapIndex;
      
              public RootIndex(Stream stream)
              {
                  this.reader = new BinaryReader(stream);
                  this.Initialize();
              }
      
              public PagedFile Pages { get; private set; }
      
              public Stream OpenRead(int streamIndex)
              {
                  var streamInfo = this.streams[streamIndex];
                  return new PdbStream(this, streamInfo);
              }
              public void Close(bool leaveStreamOpen)
              {
                  if (!leaveStreamOpen)
                      this.reader.Dispose();
              }
      
              private void Initialize()
              {
                  this.reader.BaseStream.Position = 0x20;
                  var pageSize = this.reader.ReadUInt32();
                  var pageFlags = this.reader.ReadUInt32();
                  var pageCount = this.reader.ReadUInt32();
                  var rootSize = this.reader.ReadUInt32();
                  this.reader.ReadUInt32(); // skip reserved
      
                  this.Pages = new PagedFile(this.reader.BaseStream, pageSize, pageCount);
                  this.freePageMapIndex = pageFlags;
      
                  // Calculate the number of pages needed to store the root data
                  int rootPageCount = (int)(rootSize / pageSize);
                  if ((rootSize % pageSize) != 0)
                      rootPageCount++;
      
                  // Calculate the number of pages needed to store the list of pages
                  int rootIndexPages = (rootPageCount * 4) / (int)pageSize;
                  if (((rootPageCount * 4) % (int)pageSize) != 0)
                      rootIndexPages++;
      
                  // Read the page indices of the pages that contain the root pages
                  var rootIndices = new List<uint>(rootIndexPages);
                  for (int i = 0; i < rootIndexPages; i++)
                      rootIndices.Add(this.reader.ReadUInt32());
      
                  // Read the free page map
                  this.reader.BaseStream.Position = pageFlags * pageSize;
                  this.Pages.InitializeFreePageList(this.reader.ReadBytes((int)pageSize));
      
                  this.rootPageListStreamInfo = new StreamInfo(rootIndices.ToArray(), (uint)rootPageCount * 4);
      
                  // Finally actually read the root indices themselves
                  var rootPages = new List<uint>(rootPageCount);
                  using (var rootPageListStream = new PdbStream(this, this.rootPageListStreamInfo))
                  using (var pageReader = new BinaryReader(rootPageListStream))
                  {
                      for (int i = 0; i < rootPageCount; i++)
                          rootPages.Add(pageReader.ReadUInt32());
                  }
      
                  this.rootStreamInfo = new StreamInfo(rootPages.ToArray(), rootSize);
                  using (var rootStream = new PdbStream(this, this.rootStreamInfo))
                  {
                      var rootReader = new BinaryReader(rootStream);
      
                      uint streamCount = rootReader.ReadUInt32();
      
                      var streamLengths = new uint[streamCount];
                      for (int i = 0; i < streamLengths.Length; i++)
                          streamLengths[i] = rootReader.ReadUInt32();
      
                      var streamPages = new uint[streamCount][];
                      for (int i = 0; i < streamPages.Length; i++)
                      {
                          if (streamLengths[i] > 0 && streamLengths[i] < int.MaxValue)
                          {
                              uint streamLengthInPages = streamLengths[i] / pageSize;
                              if ((streamLengths[i] % pageSize) != 0)
                                  streamLengthInPages++;
      
                              streamPages[i] = new uint[streamLengthInPages];
                              for (int j = 0; j < streamPages[i].Length; j++)
                                  streamPages[i][j] = rootReader.ReadUInt32();
                          }
                      }
      
                      for (int i = 0; i < streamLengths.Length; i++)
                      {
                          this.streams.Add(
                              new StreamInfo(streamPages[i], streamLengths[i])
                          );
                      }
                  }
              }
          }
          private sealed class StreamInfo
          {
              private uint[] pages;
              private uint length;
      
              public StreamInfo(uint[] pages, uint length, bool dirty = false)
              {
                  this.pages = pages;
                  this.length = length;
                  this.IsDirty = dirty;
              }
      
              public uint[] Pages
              {
                  get => this.pages;
                  set
                  {
                      if (this.pages != value)
                      {
                          this.pages = value;
                          this.IsDirty = true;
                      }
                  }
              }
              public uint Length
              {
                  get => this.length;
                  set
                  {
                      if (this.length != value)
                      {
                          this.length = value;
                          this.IsDirty = true;
                      }
                  }
              }
              public bool IsDirty { get; private set; }
          }
      }
      
      posted in Support
      atripp
      atripp
    • RE: Symbol Server id issue

      Hi @it_9582 ,

      It's certainly possible; there are a few hundred lines of code that make up the MicrosoftPdbFile class, so I don't know which parts to share with you. Of course I'm happy to share it all if you'd like.

      Since you mentioned your colleague was able to read the file, perhaps you can share what you did, and I can see how it compares to our code?

      Thanks,
      Alana

      posted in Support
      atripp
      atripp
    • RE: Symbol Server id issue

      Hi @it_9582 ,

      If there's an error reading the file using GetMetadataReader, then we load it using the MicrosoftPdbFile class that we wrote. So, I'm guessing that's what's causing the wrong information?

      Anyway, let me share the full code for the PortablePdbFile class. I summarized it before, but this way you can see the full context of what we're doing and why.

      using System.Collections.Immutable;
      using System.IO;
      using System.Reflection.Metadata;
      
      namespace Inedo.ProGet.Symbols;
      
      public sealed class PortablePdbFile : IPdbFile
      {
          private readonly MetadataReader metadataReader;
      
          private PortablePdbFile(MetadataReader metadataReader) => this.metadataReader = metadataReader;
      
          // visual studio always treats this value like a guid, despite the portable pdb spec
          public ImmutableArray<byte> Id => this.metadataReader.DebugMetadataHeader.Id.RemoveRange(16, 4);
      
          // not really age, but actually last 4 bytes of id - ignored by visual studio
          uint IPdbFile.Age => BitConverter.ToUInt32(this.metadataReader.DebugMetadataHeader.Id.ToArray(), 16);
          bool IPdbFile.IsPortable => true;
      
          public IEnumerable<string> GetSourceFileNames()
          {
              foreach (var docHandle in this.metadataReader.Documents)
              {
                  if (!docHandle.IsNil)
                  {
                      var doc = this.metadataReader.GetDocument(docHandle);
                      yield return this.metadataReader.GetString(doc.Name);
                  }
              }
          }
      
          public static PortablePdbFile Load(Stream source)
          {
              if (source == null)
                  throw new ArgumentNullException(nameof(source));
      
              try
              {
                  var provider = MetadataReaderProvider.FromPortablePdbStream(source, MetadataStreamOptions.LeaveOpen);
                  var reader = provider.GetMetadataReader();
                  if (reader.MetadataKind != MetadataKind.Ecma335)
                      return null;
      
                  return new PortablePdbFile(reader);
              }
              catch
              {
                  return null;
              }
          }
      
          void IDisposable.Dispose()
          {
          }
      }
      
      posted in Support
      atripp
      atripp
    • RE: Proget 25.x and Azure PostGres

      Hi @certificatemanager_4002 ,

      From the cybersecurity perspective, it's fine to leave it as root since the core process is run by the non-root user postgres inside of the container. You're never exposing a network service while the containerized process has root privileges.

      Here is more information on this if you're curious:
      https://stackoverflow.com/questions/73672857/how-to-run-postgres-in-docker-as-non-root-user

      As you can see in that link, it's technically possible to configure as non-root, but it requires more effort and doesn't really get you any benefit.

      As for load-testing and restarting, it really depends on the hardware and similar factors. Keep in mind that InedoDB is simply the postgresql container image with some minor configuration tweaks/changes. So any question you ask about InedoDB you can really ask about postgresql as well.

      As for using an external PostgreSQL server, the only information we have at this time is in the link I sent you before. You really need to be an expert on PostgreSQL if you wish to run your own server.

      Thanks,
      Alana

      posted in Support
      atripp
      atripp
    • RE: An error occurred in the web application: v3 index.json not found.

      It looks like someone (at IP 10.2.12.133) has configured the wrong URL in Visual Studio or something. They are trying to access the NuGet API via the wrong URL (note the /feeds vs /nuget at the base url).

      Cheers,
      Alana

      posted in Support
      atripp
      atripp
    • RE: Update stats on MSSQL

      hi @sigurd-hansen_7559 ,

      ProGet runs EXEC sp_updatestats, which "runs UPDATE STATISTICS against all user-defined and internal tables in the current database."

      According to the documentation, "you must be the owner of the database (dbo)," so a dbo-user should have permission. Maybe they did something strange and somehow blocked it.

      Please note that the ProGet database user must be a db_owner, or ProGet will not function. This is a requirement starting in ProGet 2025.

      Thanks,
      Alana

      posted in Support
      atripp
      atripp
    • RE: Proget 25.x and Azure PostGres

      Hi @certificatemanager_4002 ,

      InedoDB is relatively new, so we're still working through some of the setup and installation documentation. We recently published a new release with installer improvements that address some default configuration issues.

      Otherwise, here is the documentation we have on using an External PostgreSQL server:
      https://docs.inedo.com/docs/installation/postgresql#external-postgres

      Note that we don't recommend it unless you have PostgreSQL server expertise - especially when it comes to Azure PostgreSQL, since they make certain customizations to the engine that will be difficult to troubleshoot/diagnose if you're not familiar with them. In particular when it comes to resource constraints and limiting/throttling usage.

      So, unless you have that in-house expertise, we suggest InedoDB.

      Thanks,
      Alana

      posted in Support
      atripp
      atripp
    • RE: Symbol Server id issue

      Hi @it_9582 ,

      I'm not really sure of the differences between the files, to be honest. But that could be possible?

      I've heard that some of the PDB code is really ancient (30+ years old) and uses old .exe files that no other program can read, but I'm not sure if that is true.

      This is the code we use, by the way:

      var provider = MetadataReaderProvider.FromPortablePdbStream(source, MetadataStreamOptions.LeaveOpen);
      var reader = provider.GetMetadataReader();
      if (reader.MetadataKind != MetadataKind.Ecma335)
          return null;
      reader.DebugMetadataHeader.Id //<-- that is how we get the PDB ID
      

      If that function is returning the wrong Id, then I don't know if we can solve the problem. It doesn't seem very feasible to write our own PDB parsing, and I don't think there's another way to do it...

      Thanks,
      Alana

      posted in Support
      atripp
      atripp
    • RE: Download the cached packages to local machine

      Hi @Johnpeter-Commons_1617 ,

      Is this network totally air-gapped, or is the ProGet server in something like a DMZ?

      In general, you shouldn't be accessing the internal file store... but in air-gapped scenarios, sometimes you need to be creative. Here is where you can find the location of the packages:

      https://docs.inedo.com/docs/proget/feeds/feed-overview/proget-feed-storage

      Cheers,
      Alana

      posted in Support
      atripp
      atripp
    • RE: Symbol Server id issue

      Hi @it_9582 ,

      If I'm understanding correctly, the issue is that windbg is attempting to download the symbol with an ID of 0511335..., but the only symbol you're seeing in ProGet is 58544d5...?

      Since this is for C++, this is the Windows/PortablePdb format. In that case, ProGet is using the built-in class called MetadataReader to parse this information. I mean it's possible there's a bug in there, but I think it's more likely that it's the wrong file getting uploaded or something to that effect.

      As far as the URLs... I'm not sure what the correct one is, but windbg seems to try a whole bunch of URLs before it lands on the correct one in ProGet. But if you're seeing that 58544d5... symbol in ProGet, then it would be downloadable.

      Thanks,
      Alana

      posted in Support
      atripp
      atripp
    • RE: Is it possible to run PostgreSQL using ProGet.exe without writing a file to disk?

      Hi @dev_7037 ,

      We've actually just changed this and, in the upcoming maintenance release, you'll be able to specify "-" as the file name. When you do that, the query can be entered via stdin.

      Cheers,
      Alana

      posted in Support
      atripp
      atripp
    • RE: Symbol Server id issue

      Hi @it_9582 ,

      What version of ProGet are you using? There was a recent regression (PG-3204) with regards to the symbol server that was fixed in ProGet 2025.19. So hopefully upgrading will fix the issue.

      Cheers,
      Alana

      posted in Support
      atripp
      atripp
    • RE: The SSL connection could not be established, see inner exception.

      Hi @jeff-williams_1864 ,

      I'm not quite sure why nuget.org would report using a self-signed certificate? That seems off, but it sounds like you're doing "something" with regards to certificates that I don't quite understand :)

      On that note, the /usr/local/share/ca-certificates volume stores the certificates to be included in the container's certificate authority, which is used when connecting to a server with self-signed certificates: https://docs.inedo.com/docs/installation/linux/docker-guide#supported-volumes

      Hope that helps,

      Alana

      posted in Support
      atripp
      atripp
    • RE: ProGet license injection in AKS Pod

      hi @certificatemanager_4002 ,

      The 500 is occurring on /health because licenseStatus=Error, and the software is basically unusable until you correct the license issue.

      You would see a similar "blocking" error in the ProGet UI as well - so just check that, and once you correct the license error, the health check will return to normal.

      Thanks,
      Alana

      posted in Support
      atripp
      atripp
    • RE: ProGet license injection in AKS Pod

      Hi @certificatemanager_4002 ,

      The license key is set via the UI, so you can browse/access the service as per normal. Then, you will be prompted to do that right away when there is no key or it has expired: https://docs.inedo.com/docs/myinedo/activating-a-license-key

      Thanks,
      Alana

      posted in Support
      atripp
      atripp
    • RE: Unverified/not approved chocolatey package categorized with Vulnerabilities:None

      Hi @svc-4x9p2a_6341 ,

      First and foremost, Chocolatey does not incorporate "Vulnerabilities" (i.e. centrally aggregated reports of vendor-reported weaknesses in software) into the package ecosystem. This is just not something that's a part of the Windows ecosystem as a whole, unlike the Linux ecosystem (e.g. Ubuntu OVALs).

      Chocolatey does, however, perform automated malware/virus scanning on packages. That's a totally different thing... please read our How Virus Scanning in Chocolatey Works article to learn more.

      From a technical standpoint, ProGet will use (abuse?) the vulnerability subsystem to treat "flagged" packages as vulnerable. This was a "quick and dirty" way for us to experiment with exposing this data through ProGet without having to build an entirely new subsystem just for Chocolatey packages.

      As for crystalreports2008runtime, it did not fail the virus/malware checking, so it's not going to be seen as "vulnerable" by ProGet. Instead, it hasn't been "validated" by Chocolatey's automated system. That's a different feature altogether (i.e. unrelated to virus checking) - and that ancient crystal reports package long predates the moderation feature in Chocolatey I believe.

      In any case, ProGet does not expose nor allow users to "filter" on this validation status, and it's highly unlikely such a capability would add much value to users - especially considering no one has asked for it, and the cost of developing an entirely new, Chocolatey-only feature is nontrivial.

      The main reason is that everyone internalizes their packages; see Why You Should Privatize and Internalize your Chocolatey Packages to learn more.

      Hope that helps, maybe @steviecoaster can assist more.

      Cheers,
      Alana

      posted in Support
      atripp
      atripp
    • RE: Universal Package Versioning

      hi @tyler_5201,

      For a case like this, I'd recommend using a custom metadata field like _vendorVersion or something like that. That part is relatively easy.

      The hard part is "mapping" the vendor numbers to a SemVer. I would look at the data and decide how you want to "pack" them into three segments.

      2024.3.201 might work, assuming there are fewer than 100 revisions per service pack. Or maybe 2024.302.1. The number is really just for you, so whatever makes sense to you :)
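      To make that concrete, here's a minimal sketch (in Python, using a made-up "2024 SP3 Rev 201" vendor format -- the pattern and the to_semver name are illustrative, so adjust them to whatever your vendor actually uses) of packing a vendor version into three SemVer segments:

      ```python
      import re

      def to_semver(vendor_version: str) -> str:
          """Pack a hypothetical '<year> SP<n> Rev <n>' vendor version into SemVer."""
          m = re.fullmatch(r"(\d+)\s+SP(\d+)\s+Rev\s*(\d+)", vendor_version.strip())
          if m is None:
              raise ValueError(f"unrecognized vendor version: {vendor_version!r}")
          year, sp, rev = m.groups()
          # int() drops any leading zeros so each segment is a valid SemVer integer
          return f"{int(year)}.{int(sp)}.{int(rev)}"

      print(to_semver("2024 SP3 Rev 201"))  # 2024.3.201
      ```

      The original vendor string would then go in the custom metadata field, so nothing is lost in the mapping.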

      Cheers,
      Alana

      posted in Support
      atripp
      atripp
    • RE: Using curl to either check or download a script file in Otter

      Hi @scusson_9923 ,

      One idea ... how about a try/catch block?

      It's not great.... but the catch will indicate the file doesn't exist.

      Just a thought...

      Thanks,
      Alana

      posted in Support
      atripp
      atripp
    • RE: Zabbix rpm feed not working correctly

      Hi @Sigve-opedal_6476 , we're currently investigating and will let you know more later this week

      posted in Support
      atripp
      atripp
    • RE: Vulnerability checking on Maven packages

      Hi @davi-morris_9177 ,

      Unfortunately, the source data for these particular vulnerabilities specify invalid version numbers. A valid Maven version is a 5-part number consisting of 1-3 integer segments (separated by a .), an optional build number (prefixed with a -), and then an optional qualifier (another -). Following these rules, 2.9.10.8 is invalid.

      Valid versions are semantically sorted, whereas invalid versions are alphabetically sorted -- which is what's causing the big headache here, since "2.21.1" < "2.9.10.8" when you sort alphabetically.
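      A quick illustration (in Python for brevity) of how alphabetical sorting inverts the ordering here:

      ```python
      a, b = "2.21.1", "2.9.10.8"

      # Alphabetical (character-by-character): '2' < '9' at the third character,
      # so "2.21.1" sorts *before* "2.9.10.8"
      print(a < b)  # True

      # Semantic (segment-by-segment, numeric): 21 > 9, so 2.21.1 is the newer version
      def segments(version):
          return [int(part) for part in version.split(".")]

      print(segments(a) < segments(b))  # False
      ```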

      At this time, we don't have any means to "override / bypass" source data, and rewriting/updating our Maven version parsing for just a small corner case (i.e. these old/irrelevant vulnerabilities in particular) doesn't seem worthwhile.

      As such, for the time being, your best solution is just to "Ignore" these vulnerabilities via an assessment. They are totally irrelevant now, not just because they refer to ancient versions, but there is simply no realistic real-world exploit path: https://cowtowncoder.medium.com/on-jackson-cves-dont-panic-here-is-what-you-need-to-know-54cd0d6e8062

      FYI - for ProGet 2026, we are working on a lot of improvements in vulnerability management that will reduce the noise of these non-exploitable vulnerabilities so teams can address actual risk and focus on delivering value instead of constant patching.

      Thanks,
      Alana

      posted in Support
      atripp
      atripp
    • RE: Layer Scanning is not working with images which is pushed with --compression-format zstd:chunked

      Hi @geraldizo_0690 ,

      Nice find with the busybox image... that makes it a lot easier to test/debug on our end!!

      We already have a zstd library in ProGet so, in theory, it shouldn't be that difficult to use that for layers like this. We'll add that via PG-3218 in an upcoming maintenance release -- currently targeting February 20.

      Thanks,
      Alana

      posted in Support
      atripp
      atripp
    • RE: Layer Scanning is not working with images which is pushed with --compression-format zstd:chunked

      Hi @geraldizo_0690 ,

      Are you seeing any errors/messages logged like "Blob xxxxxxx is not a .tar.gz file; nothing to scan."? If you go to Admin > Executions, you may see some historic logs about container scanning.

      Thanks,
      Alana

      posted in Support
      atripp
      atripp
    • RE: Zabbix rpm feed not working correctly

      Hi @Sigve-opedal_6476 ,

      Could you give some tips/guidance on how to repro the error? Ideally, it's something we can see only in ProGet :)

      It's probably some quirk in how they implement things, but I wanted to make sure we're looking at the right things before starting.

      Thanks,
      Alana

      posted in Support
      atripp
      atripp
    • RE: Using curl to either check or download a script file in Otter

      Hi @scusson_9923 ,

      That is an internal/web-only API URL, so it wouldn't behave quite right outside a web browser.

      I can't think of an easy way to accomplish what you're looking to do.... if you could share some of the bigger picture, maybe we can come up with a different approach / idea that would be easier to accomplish.

      Thanks,
      Alana

      posted in Support
      atripp
      atripp
    • RE: InitContainers never start with Azure Sql on ProGet 25.0.18

      Hi @certificatemanager_4002 ,

      I'm sorry but I'm not familiar enough with Kubernetes to help troubleshoot this issue.

      All that I recognize here is the upgradedb command, which is documented here:
      https://docs.inedo.com/docs/installation/linux/installation-upgrading-docker-containers#upgrading-the-database-only-optional

      If you run that command from the command-line (on either Linux or Windows), messages will be written to the console. I wish I could tell you why you aren't seeing them.

      Thanks,
      Alana

      posted in Support
      atripp
      atripp
    • RE: Proget apt snapshot support?

      Hi @phil-sutherland_3118 ,

      This is not on our roadmap. Honestly, we don't really understand what a "snapshot" repository is or how they are used.

      We surveyed some customers about it a while ago, and this summarizes what they said: repository snapshots are archaic; they made sense a long time ago, but Docker changed all that. It's so much simpler to use container images like FROM debian:buster-20230919. That's effectively our snapshot, and when we need to maintain old releases (which happens more often than I'd like), we just rebuild the image from that. The other big advantage is that build time is easily 10x faster if not more.

      And then we saw that Debian also maintains their own snapshots (https://snapshot.debian.org/), so we don't quite get how they are used outside of a handful of use cases (like a build process for a specialized appliance OS without Docker).

      Anyway, we're open to considering it... but only two people (including you) have asked in the past several years, so there's no real interest... and we're not sure what they even do :)

      That said, it's possible there's a way to accomplish something that has the same outcomes. For example:

      • create a public aggregate feed (jammy-all) with multiple connectors to Debian, Ubuntu, NGINX, Elasticsearch, etc.
      • create a release feed (jammy-20231101) that snapshots jammy-all

      But we don't know enough to answer that :)

      Thanks,
      Alana

      posted in Support
      atripp
      atripp