    Inedo Community Forums
    Posts made by atripp

    • RE: Buildmaster - High CPU database since 6.2.22

      Thanks @philippe-camelio_3885

      So, the good news is, we've identified the problem: there was just a huge number of manual executions happening, for some reason, and the manual execution purging routine could never catch up. Changing those throttles wouldn't make a difference, I'm afraid, as none of them will trigger a manual execution...

      First, can you please share the results of this query, so we can see what created all of those?

      SELECT [ExecutionType_Name], COUNT(*) FROM [ManualExecutions] GROUP BY [ExecutionType_Name]

      That will tell us what Manual Executions are around, mostly so we can understand what they are. I suspect it's infrastructure sync.

      That being said... the first thing I'm now seeing is that the report looks old: the number of rows is 164,125, which is the exact same number as before. So I'm thinking you didn't actually commit the transaction in the query I posted earlier? It included a ROLLBACK statement as a safety measure... that's my fault; I should have said to only run the DELETE if you were satisfied with the results.

      Since the query seems okay (it reduced the rows from 164K down to 1K), please run this:

      DELETE E
        FROM [Executions] E
       INNER JOIN (SELECT [Execution_Id],
                          ROW_NUMBER() OVER(PARTITION BY [ExecutionMode_Code] ORDER BY [Execution_Id] DESC) [Row]
                     FROM [Executions]
                    WHERE [ExecutionMode_Code] IN ('R', 'M', 'T')) EE
          ON E.[Execution_Id] = EE.[Execution_Id]
       WHERE EE.[Row] > 1000
      

      From here, it should actually be fine...

      posted in Support
      atripp
    • RE: Connection reset while downloading npm packages

      @mathieu-belanger_6065 thanks for all of the diagnostic and additional information. I think you're right, it's environment / network specific, and not related to ProGet.

      I would check the ProGet Diagnostic Center, under Admin as well.

      Otherwise, ProGet doesn't operate at the TCP level; it uses ASP.NET's network stack. There's really nothing special about how npm packages are handled compared with other packages, and we haven't heard of any other issues regarding this.

      For reference, here's the code that transmits a package file. Note that if you're using connectors and the package isn't cached in ProGet, then each connector must be queried; this can yield quite a lot of network traffic.

                  if (metadata.IsLocal)
                  {
                      using (var stream = await feed.OpenPackageAsync(packageName, metadata.Version, OpenPackageOptions.DoNotUseConnectors))
                      {
                          await context.Response.TransmitStreamAsync(stream, "package.tgz", MediaTypeNames.Application.Octet);
                      }
                  }
                  else
                  {
                      var nameText = packageName.ToString();
      
                      var validConnectors = feed
                          .Connectors
                          .Where(c => c.IsPackageIncluded(nameText));
      
                      foreach (var connector in validConnectors)
                      {
                          var remoteMetadata = await connector.GetRemotePackageMetadataAsync(packageName.Scope, packageName.Name, metadata.Version.ToString());
                          if (remoteMetadata != null)
                          {
                              var tarballUrl = GetTarballUrl(remoteMetadata);
                              if (!string.IsNullOrEmpty(tarballUrl))
                              {
                                  var request = await connector.CreateWebRequestInternalAsync(tarballUrl);
                                  request.AutomaticDecompression = DecompressionMethods.None;
                                  using (var response = (HttpWebResponse)await request.GetResponseAsync())
                                  using (var responseStream = response.GetResponseStream())
                                  {
                                      context.Response.BufferOutput = false;
                                      context.Response.ContentType = MediaTypeNames.Application.Octet;
                                      context.Response.AppendHeader("Content-Length", response.ContentLength.ToString());
      
                                      if (feed.CacheConnectors)
                                      {
                                          using (var tempStream = TemporaryStream.Create(response.ContentLength))
                                          {
                                              await responseStream.CopyToAsync(tempStream);
                                              tempStream.Position = 0;
      
                                              try
                                              {
                                                  await feed.CachePackageAsync(tempStream);
                                              }
                                              catch
                                              {
                                              }
      
                                              tempStream.Position = 0;
                                              await tempStream.CopyToAsync(context.Response.OutputStream);
                                          }
                                      }
                                      else
                                      {
                                          await responseStream.CopyToAsync(context.Response.OutputStream);
                                      }
      
                                      return true;
                                  }
                              }
                          }
                      }
                  }
      
      posted in Support
      atripp
    • RE: Buildmaster - High CPU database since 6.2.22

      The Execution_Configuration column of the ManualExecutions table will give a clue; it's XML, but if you expand the column, you'll see the name of the manual execution.

      It's only supposed to log if something changed, however...

      If there's a bug, one way to check would be to disable infrastructure sync, for the time being.

      posted in Support
      atripp
    • RE: Buildmaster - High CPU database since 6.2.22

      If I'm understanding correctly, did your Manual Execution records go from 1,000 to 164,000 in just a few days? If so, that would explain a lot...

      These are the types of so-called Manual Executions:

      • Importing or Exporting Applications
      • Cloning and Applying Template to Applications
      • Sync of Issue Sources
      • Deploying Configuration file
      • Upgrading Inedo Agents
      • Sync infrastructure

      They are supposed to occur only on a manual basis, like when you trigger something from the UI (so you can get logs) or, in the case of infrastructure sync, whenever infrastructure changes.

      Any idea what all the manual executions could be?

      posted in Support
      atripp
    • RE: [Otter]Server restart failed

      Hi @Adam1 ,

      The Restart-Server operation is performed on the server itself, using the Inedo Agent or PowerShell Agent.

      Behind the scenes, the agent will just use the advapi32.dll::InitiateShutdown Win32 API method, and that error string indicates that Windows is returning ERROR_ACCESS_DENIED when attempting to initiate the Shutdown. This is the same method that shutdown.exe uses behind the scenes as well.

      So basically, just make sure that the agent process is running as an admin/system account.
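      For illustration only, here's a rough Python sketch of the same advapi32 call and the error mapping described above; the wrapper function, the flag value, and the hint text are my own assumptions, not the agent's actual code:

```python
import ctypes
import platform

ERROR_ACCESS_DENIED = 5  # the Win32 error behind that error string

def explain_shutdown_error(code):
    # Map the Win32 error code to the remediation suggested above.
    if code == ERROR_ACCESS_DENIED:
        return "run the agent service as an admin/system account"
    return f"unexpected Win32 error {code}"

def restart_server():
    # Calls the same advapi32.dll::InitiateShutdown API the agent uses.
    # Windows-only; the SHUTDOWN_RESTART value here is an assumption.
    if platform.system() != "Windows":
        raise OSError("InitiateShutdown is a Windows-only API")
    SHUTDOWN_RESTART = 0x00000004
    advapi32 = ctypes.WinDLL("advapi32")
    rc = advapi32.InitiateShutdownW(None, None, 0, SHUTDOWN_RESTART, 0)
    if rc != 0:  # InitiateShutdown returns a Win32 error code; 0 is success
        raise OSError(f"InitiateShutdown failed: {explain_shutdown_error(rc)}")
```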

      Best,
      Alana

      posted in Support
      atripp
    • RE: Buildmaster - High CPU database since 6.2.22

      How often is this happening? It shows that 102 executions were purged, and based on the I/O, a lot of logs were deleted... this can actually be quite resource-intensive, as there is a lot of log data.

      But this usually happens during off-hours, etc., so it shouldn't be disruptive.

      posted in Support
      atripp
    • RE: OTTER 3.0 - Git Based-Raft ?

      @Joshua_1353 did this work in Otter v2?

      The "too many redirects/auth requests" error is usually a kind of red herring, and points to some sort of configuration problem (corrupt local repository, cached credentials, etc.). We'd need to see the whole stack trace --- but could you post it to a new Topic, so we can track it separately?

      I don't think it's related to v3. The reason it didn't show in v3 is that we just forgot to tag it properly after some refactoring changes in Otter.

      posted in Support
      atripp
    • RE: OTTER 3.0 - Git Based-Raft ?

      Thanks @Joshua_1353! Looks like this was a minor configuration change, where that particular repository type wouldn't load in Otter v3. I added a missing attribute and rebuilt, so now it's displayed in the list.

      Easy fix; just download the latest Git extension (1.10.1).

      posted in Support
      atripp
    • RE: Functional differences between different "Feed Usage" options

      Hi @Stephen-Schaff_8186,

      Thanks for the clarifications! In fact, I wanted to learn some of this behavior myself, and here's what I discovered.

      I'm sharing the details because I think we should take the opportunity to clarify not only the docs but also the UI, since it seems like this can be improved. It's a new concept in ProGet 5.3, primarily intended to guide the set-up of new feeds, so we haven't looked at it closely since first adding the feature.

      Feed Type Sets

      There are two sets of feed type options, and which ones are displayed depends on whether the feed type is denoted as having a public gallery (HasPublicGallery).

      HasPublicGallery == true

      • "free/open source packages"
      • "private/internal packages"
      • "validated/promoted packages"
      • "mixed public/private packages"

      HasPublicGallery == false

      • "private/internal packages"
      • "validated/promoted packages"

      These all map to an enum: Mixed = 0, PrivateOnly = 1, PublicOnly = 2, Promoted = 3.

      HasPublicGallery

      The following feed types are denoted (internally) as having an official, public gallery: Chocolatey, Cran, Maven, Npm, NuGet, PowerShell, Pypi, RubyGems.

      • Helm and Docker are not on this list, perhaps because there's no official gallery? I'm not sure.
      • Debian and RPM are not on this list, because I don't think they support connectors

      Feed Type Behavior

      Almost all of the behavioral changes occur in the "out of box tutorial", to guide users through the setup. Aside from that, here's the UI impact I found:

      FeedType == PublicOnly

      • On the list packages page (e.g. /feed/MyFeed):

        • the "package filter info" is displayed as "Unfiltered" if no package filters are configured, to bring visibility to the importance of package filters
        • the "vulnerability status" is displayed as "Not Scanned" if vulnerability scanning is not configured
      • On the Package Versions page (e.g. /feed/MyFeed/MyPackage/versions):

        • the "vulnerability status" is displayed as "Not Scanned", even if no vulnerabilities are detected

      FeedType == PrivateOnly

      • Feed allows AllowUnknownLicenseDownloads, regardless of the global setting; this feels like a big behavioral change, but it makes sense: why would you apply license rules to your own packages?
      • The Manage License Filter page displays an error.
      • On the Package Overview page (/feed/MyFeed/MyPackage/1.2.3), the license information box is not displayed
      • On the List Package Versions page (/feed/MyFeed/MyPackage/versions), the license information box is not displayed

      FeedType == Promoted

      • On the List Packages page (/feed/MyFeed), the Add Package button is disabled

      FeedType == Mixed

      No UI changes.

      Next Steps?

      Well, that's everything. Any opinions / suggestions?

      I'm not sure why the Add Package button is disabled; of course, you can still use the API, or even navigate directly to the page. Perhaps a warning on the Add Package page would be better?

      Cheers,
      Alana

      posted in Support
      atripp
    • RE: Proget 5.0.10 docker with MSSQL server

      This upgrade path isn't supported, and ProGet 5.0.1 does not work on SQL Server.

      Your best route for upgrading is ProGet 5.0 > ProGet 5.3; then, migrate to ProGet for Linux.

      posted in Support
      atripp
    • RE: ProGet - Use Connector filters like package search

      Hello;

      That search syntax is really only supported by the NuGet v3 API, I think; so ProGet simply forwards the query on to that API and returns the results.

      But regardless, connector filters need to be applied after the remote feed returns results, because connector filter logic can be more complex than what is supported by the various feed APIs (you can allow Microsoft.* and Inedo.*, for example).

      More advanced connector filter options are definitely something we've considered, and we'd love to do things like "version: 3.*" for example. But, it's a lot more complicated under the hood, and probably isn't even feasible given the nature of feeds.
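      As a hedged illustration (this is not ProGet's actual code), here's why wildcard allow-filters like that have to be applied locally, after the remote feed has already returned its results:

```python
from fnmatch import fnmatch

def is_package_included(package_id, allow_filters):
    # Case-insensitive wildcard match; the filter patterns are hypothetical.
    return any(fnmatch(package_id.lower(), f.lower()) for f in allow_filters)

# Results as a remote feed might return them; the remote API can't
# evaluate these patterns itself, so filtering must happen afterwards.
remote_results = ["Microsoft.Extensions.Logging", "Inedo.SDK", "SomeOther.Lib"]
filters = ["Microsoft.*", "Inedo.*"]

filtered = [p for p in remote_results if is_package_included(p, filters)]
# filtered -> ["Microsoft.Extensions.Logging", "Inedo.SDK"]
```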

      Alana

      posted in Support
      atripp
    • RE: Buildmaster - High CPU database since 6.2.22

      My bad, can you try this instead? Basically we are trying to delete all "R, M, T" executions except the most recent 1000 of each type.

      This is what the code is doing now, just really inefficiently for some reason -- and the inefficiency seems to have caused a "backlog" of sorts.

      USE [BuildMaster]
      
      BEGIN TRANSACTION
      
      DELETE E
        FROM [Executions] E
       INNER JOIN (SELECT [Execution_Id],
                          ROW_NUMBER() OVER(PARTITION BY [ExecutionMode_Code] ORDER BY [Execution_Id] DESC) [Row]
                     FROM [Executions]
                    WHERE [ExecutionMode_Code] IN ('R', 'M', 'T')) EE
          ON E.[Execution_Id] = EE.[Execution_Id]
       WHERE EE.[Row] > 1000
      
      SELECT [ExecutionMode_Code], COUNT(*) FROM [Executions] GROUP BY [ExecutionMode_Code]
      
      ROLLBACK
      
      posted in Support
      atripp
    • RE: OTTER 3.0 / Fresh install Install with remote DB from InedoHub failed on non-US system

      Hello;

      We haven't seen this error in quite a long time, and I remember it was something we addressed ages ago. Internally, we still jokingly call it NT AUTORIDAD from that experience, because who would have guessed they localized those names...

      That said, we didn't really change the installation process for Otter 3.0; it's really just a copy/paste of the ProGet and BuildMaster installation scripts. I don't know why we wouldn't have heard of it until now.

      It's possible this solution was never brought over to the Inedo Hub? Maybe... for now, we'd rather not mess with the installation scripts until we get more reports, so we'll just wait to hear more. If anyone else experiences this in any product, please share it :)

      Alana

      posted in Support
      atripp
    • RE: [ProGet] Manual database upgrade (docker, kubernetes)

      Thanks @viceice !!

      I think we're really close, actually. You're right: the service code (copied below) sets the version, so this wouldn't work anyways.

      However, I think if we just write out a script at build time that does something like EXEC Configuration_SetValue 'Internal.DbSchemaVersion', $VersionNumber... and then include that in SqlScripts.zip, it would always work.

      Ultimately, what I'd love to do is build a guide like the Docker Compose Installation Guide, but for Kubernetes.

      If we get the version-number-setter script included in SqlScripts.zip, how close do you think we'll be to a Kubernetes install guide? Would it basically be the same as the Compose guide, but with k8s commands and k8s sample code instead?

              private static int UpdateDatabaseSchema()
              {
                  Console.WriteLine($"ProGet version is {typeof(Program).Assembly.GetName().Version}.");
                  var currentVersion = getCurrentVersion();
                  Console.WriteLine($"Current DB schema version is {currentVersion?.ToString() ?? "unknown"}.");
                  if (currentVersion == typeof(Program).Assembly.GetName().Version)
                      return 0;
      
                  using (var p = Process.Start(getStartInfo()))
                  {
                      p.WaitForExit();
      
                      if (p.ExitCode == 0)
                          setCurrentVersion();
      
                      return p.ExitCode;
                  }
      
                  static ProcessStartInfo getStartInfo()
                  {
      #if NET452
                      return new()
                      {
                          FileName = "mono",
                          Arguments = $"/usr/local/proget/db/inedosql.exe update /usr/local/proget/db/SqlScripts.zip --connection-string=\"{SharedConfig.ConnectionString}\""
                      };
      #else
                      return new()
                      {
                          FileName = "/usr/local/proget/db/inedosql",
                          ArgumentList =
                          {
                              "update",
                              "/usr/local/proget/db/SqlScripts.zip",
                              $"--connection-string={SharedConfig.ConnectionString}"
                          }
                      };
      #endif
                  }
      
                  static Version getCurrentVersion()
                  {
                      try
                      {
                          var s = DB.Configuration_GetValue("Internal.DbSchemaVersion")?.Value_Text;
                          Version.TryParse(s, out var v);
                          return v;
                      }
                      catch
                      {
                          return null;
                      }
                  }
      
                  static void setCurrentVersion()
                  {
                      try
                      {
                          DB.Configuration_SetValue("Internal.DbSchemaVersion", typeof(Program).Assembly.GetName().Version.ToString());
                      }
                      catch
                      {
                      }
                  }
              }
      
      posted in Support
      atripp
    • RE: Buildmaster - High CPU database since 6.2.22

      Thanks Philippe!! That's helpful.

      Can you try this sql?

      USE [BuildMaster]
      
      BEGIN TRANSACTION
      
      DELETE [Executions]
       WHERE [ExecutionMode_Code] IN ('R', 'M', 'T')
         AND ROW_NUMBER() OVER(PARTITION BY [ExecutionMode_Code] ORDER BY [Execution_Id] DESC) > 1000
      
      SELECT [ExecutionMode_Code], COUNT(*) FROM [Executions] GROUP BY [ExecutionMode_Code]
      
      ROLLBACK
      

      You should then see results like this:

      R 1000
      M 1000
      S 1560
      T 377
      B 999
      

      You can further inspect the tables, but this should do the trick. If the results look okay, run only the DELETE statement, and it should be fine.

      Can you let me know if it works? We'll incorporate this logic in BM-3659.

      posted in Support
      atripp
    • RE: OTTER 3.0 - Agent type SSH

      Hello; thanks for reporting the bug! This was a UI regression; the HostName field was not being displayed on the edit form. It's already fixed in code (via OT-384), and we'll get a new release out quite soon, perhaps in a day or so.

      Thanks,
      Alana

      posted in Support
      atripp
    • RE: Buildmaster - High CPU database since 6.2.22

      Hey @philippe-camelio_3885

      You may be looking at the wrong database... based on the Otter reference in your query.

      Here's a way to see executions by execution type: SELECT [ExecutionMode_Code], COUNT(*) FROM [BuildMaster]..[Executions] GROUP BY [ExecutionMode_Code]

      As for retention policies, you should be able to see the logs of those, and see what's being purged.

      Anyways, we'll figure it out... hang tight!

      posted in Support
      atripp
    • RE: [ProGet] Manual database upgrade (docker, kubernetes)

      Hi @viceice , we don't have an official Kubernetes deployment for ProGet yet, but working with the community on figuring out how to deploy this way is how we'll get there. Eventually, we'd like to offer ProGet Enterprise via Kubernetes and enable load balancing, etc.

      I'm not so familiar with Docker or Kubernetes, to be honest, but I can answer all of your questions so we can figure it out together.

      Is there any way to run the ProGet docker image to only upgrade the database and then exit?

      Not that I can think of. Here's how it works behind the scenes:

      • On ProGet (for Windows), the database is initialized/upgraded at install time using a tool called inedosql: https://docs.inedo.com/docs/proget/installation/installation-guide/manual#database

      • On ProGet for Linux, the ProGet Service is responsible for upgrading the database on startup using the same mechanism (i.e. inedosql). This was done to simplify the Dockerfile.

      I'm not sure what an InitContainer is, but based on the name, I guess it's a container that exists only to initialize a cluster? Technically we could do several things:

      • run inedosql on Linux
      • add a commandline argument to the ProGet service to terminate after doing a database upgrade

      I don't know what would be easier, or better. What do you think?
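      For what it's worth, here's a rough sketch of what the InitContainer approach might look like as a pod spec fragment. Everything here is an assumption; in particular, the --upgrade-db-only argument is just a stand-in for the hypothetical "terminate after upgrading" command-line option:

```yaml
# Sketch only -- names, image tags, and the argument are assumptions.
spec:
  initContainers:
    - name: proget-db-upgrade
      image: inedo/proget:latest
      # Hypothetical flag: upgrade the database schema, then exit,
      # so the main container starts against an up-to-date database.
      args: ["--upgrade-db-only"]
  containers:
    - name: proget
      image: inedo/proget:latest
```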

      posted in Support
      atripp
    • RE: OTTER 3.0 - Git Based-Raft ?

      Git-based Rafts are part of the Git extension, so if you're not seeing them when creating a new Raft (or they've disappeared), then there is probably an extension load error.

      Do you see Git under Admin > Extensions?

      posted in Support
      atripp
    • RE: Buildmaster - High CPU database since 6.2.22

      We can definitely try to diagnose what's going on. What are the types of Executions in that table? Manual? Build? Etc?

      posted in Support
      atripp
    • RE: How to enable Semantic Versioning for Containers

      Hello; you should see a checkbox, like this.

      [screenshot of the checkbox]

      However, due to a license validation bug (now fixed via PG-1885), the checkbox wasn't being displayed.

      It will be fixed in the next maintenance release, scheduled for later today.

      posted in Support
      atripp
    • RE: Functional differences between different "Feed Usage" options

      Hello, great question!

      I hope that I can answer your question by showing you what I changed in the documentation:

      Feed usage controls which tabs and messages are displayed in the user interface. For example, "Private/Internal packages" won't display the license filtering options, as you wouldn't create license usage restrictions for your own packages.

      Note that not all feeds have all of these Feed Usage options. Generally speaking, we don't recommend using mixed packages, as it will present all of the user interface options; most of them won't be relevant for packages you create (like license filtering or vulnerability scanning).

      posted in Support
      atripp
    • RE: ProGet as ClickOnce publish target?

      I'm not so familiar with ClickOnce publish targets, but my understanding is that ClickOnce deploys to an ordinary web-based file directory, like a site in IIS that has "file browsing" enabled or something?

      If so, then I think an Asset Directory would work; it's basically like that, and is meant for general files. Feeds require a special API to access.

      Let us know how it goes.

      posted in Support
      atripp
    • RE: do I scale devops

      Hello; I'm not sure we can help here; this sounds like something more appropriate for the Azure DevOps community.

      posted in Support
      atripp
    • RE: Multipart body length limit 134217728 exceeded.

      Hello;

      This is apparently a default limitation in .NET 5/Core; I'm not sure if it can be changed outside of the code, but I've logged a product change (PG-1876) to get this fixed in the next maintenance release.

      Cheers,
      Alana

      posted in Support
      atripp
    • RE: No Bulk Import for Maven Feed

      @andrew_5903 thanks, I'd like to update the docs! Did you end up writing a script to just call that for each JAR+POM in your directory?

      posted in Support
      atripp
    • RE: Buildmaster - High CPU database since 6.2.22

      Typically these are cleaned up via retention policies... but where is DELETE [Executions] WHERE [Execution_Id] = @Execution_Id being called from?

      posted in Support
      atripp
    • RE: ProGet - Feature Request - End user setup button for a feed

      Hello; we haven't forgotten about this; we just haven't had an opportunity to prioritize it with everything else. Hopefully we'll hear from some other users who are seeking similar functionality.

      posted in Support
      atripp
    • RE: No Bulk Import for Maven Feed

      Hi Andrew,

      The only way to get a Maven artifact into ProGet is by using the API to push it. Re-indexing can detect missing files on disk, but it does not add new files.

      This is by design; we simply never created this functionality for ProGet, even in old versions.

      We may add it later, but there's not a lot of demand, since it's "relatively easy" to write a script to import everything via the API. Really, it's just doing a number of HTTP push commands, starting with the POM files as I mentioned.

      Cheers,
      Alana

      posted in Support
      atripp
    • RE: No Bulk Import for Maven Feed

      Hello;

      As of ProGet 5.3.17, ProGet does not support drop paths for Maven; the precise reason is complex, but it's ultimately a result of the way Maven is structured (i.e. a file system vs. packages) and how files can be added piecemeal.

      It is something we are considering adding in a future release, but it's complex. Other users have reported that they've used a process like this for a disk-based repository.

      1. Traverse all directories and upload all POM files with a path relative to the root
      2. Traverse all directories again, and upload all non-POM and non-checksum files (like .md5)

      There will be errors, particularly if you have invalid POM files or your directory structure doesn't match the required Maven convention, so inspect those case-by-case to determine if they matter (a bad artifact from 5 years ago can probably be ignored).
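      A minimal sketch of that two-pass import; the feed URL and upload endpoint here are assumptions, so adjust them (and add authentication) for your instance:

```python
import os
import urllib.request

# Assumed feed endpoint -- adjust for your ProGet instance.
FEED_URL = "https://proget.example.com/maven2/my-maven-feed"
CHECKSUM_EXTENSIONS = (".md5", ".sha1")

def plan_uploads(root):
    """Return repository-relative paths in upload order: all POM files
    first (pass 1), then everything except checksum files (pass 2)."""
    poms, others = [], []
    for dirpath, _, filenames in os.walk(root):
        for name in sorted(filenames):
            rel = os.path.relpath(os.path.join(dirpath, name), root)
            rel = rel.replace(os.sep, "/")
            if name.endswith(".pom"):
                poms.append(rel)
            elif not name.endswith(CHECKSUM_EXTENSIONS):
                others.append(rel)
    return poms + others

def upload_all(root):
    # PUT each file to the feed, preserving its path relative to the root;
    # inspect any failures case-by-case, as suggested above.
    for rel in plan_uploads(root):
        with open(os.path.join(root, rel), "rb") as f:
            req = urllib.request.Request(
                f"{FEED_URL}/{rel}", data=f.read(), method="PUT")
            with urllib.request.urlopen(req) as resp:
                print(rel, resp.status)
```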

      Unfortunately, we don't have a lot more details, but we'd love to hear about your experience if you end up scripting this.

      Cheers,
      Alana

      posted in Support
      atripp
    • RE: Buildmaster - High CPU database since 6.2.22

      Hi Philippe,

      I don't think there's anything in particular that changed which would cause this slowness.

      Those tables are big, and I would definitely recommend purging old data from them.

      In any case, the problem is likely index fragmentation: deleting an execution cascades to those tables, which can fragment their indexes.

      Here's some information about how to detect and repair index fragmentation: https://inedo.com/support/kb/1167/sql-server-recommendations

      Can you try that, and see if it helps?

      Thanks, Alana

      posted in Support
      atripp
    • RE: Create API key per packages

      ProGet does not support package-level permissions; you would need to create separate feeds, or create a custom package filter (in C#) that could have some sort of filtering logic per user.

      posted in Support
      atripp
    • RE: Validate chocolatey checksum before installing package

      Hi @Crimrose, this would probably be better to check on the Chocolatey forums / community site. But if you can find the answer, please share it :)

      posted in Support
      atripp
    • RE: Nuget packages not indexed automatically in symbol server when pushed (only manual reindex works)

      Thanks; I can confirm that there is indeed an error parsing one of the PDBs in the file you sent, which is why it won't index on upload. It could be just that particular package or file, though; if you try different packages, they might work.

      Otherwise, I'm not sure why/how it would work from the service, though...

      Anyways, we'll investigate this further, but it might take several days to schedule the time, since this involves quite complex, binary/bit-level parsing logic.

      posted in Support
      atripp
    • RE: Nuget packages not indexed automatically in symbol server when pushed (only manual reindex works)

      Hello; the files should definitely be indexed on upload...

      Could you share the package you created, so that we can evaluate/test to see if we can reproduce the problem? I'm not sure if you can attach files to the forums, but you could also email it to support at inedo dot com ... please add [QA-475] to the subject so we can easily find it.

      posted in Support
      atripp
    • RE: Create apt mirror

      Hello; due to the complexity of handling apt signing, connectors to other apt repositories are not supported for Debian feeds in ProGet at this time; we didn't anticipate this being a common use case.

      You could, however, publish .deb packages you've downloaded to the repository manually.

      posted in Support
      atripp
    • RE: ProGet V5.3.17 Excessive Database Connections

      Hi Simon,

      The default connection pool size for .NET's SQL Server connection is 100, so that behavior wouldn't be unexpected if there were a lot of requests to ProGet (very common in a package restore) and the database driver wasn't responding fast enough for whatever reason (also very common... too many network requests coming in, too many network requests going out, slow database server, etc.).

      The .NET Runtime will eventually close those connections, and they will then be terminated by the TCP stack shortly after (minutes?).

      Alana

      posted in Support
      atripp
    • RE: API Key "impersonate user" doesn't work when impersonating an LDAP user

      Hi Simon, I'm guessing there's a 500 error being thrown at the same time? Or, perhaps, it's a permission error? In any case, can you try to get a feeling for the underlying error message? That will help debug it.

      posted in Support
      atripp
    • RE: Nuget connection timeout

      Hi Peter,

      Thanks for reposting this; this error is basically what you can expect when a "network stack" is overloaded. Each request to a ProGet feed can potentially open another request to each configured connector (maybe NuGet.org, maybe others), and if you're making a lot of requests at once, you'll get a lot of network activity queued up. SQL Server is also running on the network, so those requests just get added to the queue, and eventually you run out of connections.

      Do you have a lot of containers running on the same computer that ProGet is running on? If so, that could also contribute to the network stack being overloaded.

      The best way to solve this is with more hardware, or by removing connectors to nuget.org, etc.

      Also note that users can overload a server as well, which is why load balancing is very important if you want to maintain reliability; see How to Prevent Server Overload in ProGet to learn more.

      posted in Support
      atripp
    • RE: Nuget Feed Connection timeout

      Hi @p-bruch_5023, can you post this as a new topic? It'll be a lot easier to track internally, and for other users to find, since it's a newer and different case than the other timeouts.

      posted in Support
      atripp
    • RE: Error Trying to add new User Directory (Active Directory LDAP)

      Hi Simon, based on what I'm seeing, it looks like the InedoCore extension didn't load, which is what would happen in this case...

      Can you try restarting the container?

      Under Admin > Extensions, what do you see on the InedoCore page? There should be a list of extensible components, including the directory type...

      posted in Support
      atripp
    • RE: Proget Sometimes Truncates Package Version

      These days, there are very few packages with quirks like this, so we recommend just repackaging (either using the feature, or manually) a local copy for caching purposes.

      Owin is one of these packages, unfortunately. The latest (and only) package version appears as 1.0 in some cases and 1.0.0 in others. It's also over 8 years old, so it can't be helped.

      ProGet 5.2 had better support for these version quirks using a feature called "Legacy (Quirks) Feeds", but those feeds couldn't handle SemVer queries properly, since requests like 1.0 were ambiguous (should it return 1.0 or 1.0.0, both of which are valid versions?), so it was a big tradeoff. They were fully removed in ProGet 5.3.

      posted in Support
      atripp
    • RE: BuildMaster: "server too busy"

      You can basically ignore this error.

      When you press the Save button on the All Settings page, it triggers a restart of the application pool after saving. Most of the time the restart isn't noticed at all, but occasionally you'll catch this error; it goes away within seconds if you hit refresh.

      posted in Support
      atripp
    • RE: Proget Sometimes Truncates Package Version

      Hello;

      Long story short, the problem is that your package's nuspec file has an invalid version number (1.0); if you edit the file, and put in a proper SemVer, it will be fine. This is what repackaging does, by the way. It creates a new package with a different version number.

      This is a long-standing versioning quirk with NuGet that still comes up every now and then; in the old days, you could have packages like 1.0 and 1.0.00 or even 1.000.0, and they'd all be different.

      NuGet dropped support for this over five years ago, but since old packages with quirk versions remain, the NuGet client does all sorts of strange work-arounds. For example, you may see a file request for 1.0.0, then 1.0.0.0, then 1, then 1.0.

      This is because the NuGet API now only shows three-part SemVer versions, but the files are still accessed by their original version number. ProGet does not do the "version dance" to find the real package file, which is why you get these errors.
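      To illustrate the normalization involved, here is a hedged Python sketch (not NuGet's or ProGet's actual code; `normalize_nuget_version` is a hypothetical helper) showing how different quirk versions collapse to the same three-part SemVer, which is exactly why the file lookups become ambiguous:

      ```python
      def normalize_nuget_version(version: str) -> str:
          """Normalize a quirk version the way modern NuGet does:
          strip leading zeros, pad to three numeric parts, and keep a
          fourth part only if it's nonzero (e.g. '1.0' -> '1.0.0',
          '1.000.0' -> '1.0.0', '1.2.3.4' stays '1.2.3.4')."""
          base, _, suffix = version.partition('-')
          parts = [str(int(p)) for p in base.split('.')]   # '00' -> '0'
          while len(parts) < 3:
              parts.append('0')
          result = '.'.join(parts[:3])
          if len(parts) > 3 and int(parts[3]) != 0:
              result += '.' + parts[3]
          return result + ('-' + suffix if suffix else '')
      ```

      Note how 1.0, 1.0.00, and 1.000.0 all normalize to 1.0.0, even though as original file names they were three distinct packages.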

      We eventually dropped most support for this versioning; basically, you can still have "some" quirk packages in feeds, but they won't work through connectors in most cases.

      posted in Support
      atripp
    • RE: ProGet net5.0 docker run in centos 7.8 web can't start(5.3.17)

      Hi @scroak_6473 good to know! So, I've added the WORKDIR /usr/local/proget/ line right above our CMD line, and it should go in the next release.

      posted in Support
      atripp
    • RE: ProGet Query Latest Docker Image Tag

      Hello;

      Tags in Docker registries are really just a human-readable pointer to a digest (hash) of a container image. It's really just a name + digest, and there's no additional metadata provided by the Docker API.

      This is why we encourage Semantic Versioning for Containers, and have a feature built-in that helps with this. You can then reliably parse those tags as semantic version numbers, and use them as needed.
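      For example, if your tags follow a MAJOR.MINOR.PATCH scheme, picking the latest version from a registry's tag list is straightforward; `latest_semver_tag` below is a hypothetical helper (a Python sketch, not part of ProGet's API):

      ```python
      import re

      # Matches tags like "1.2.3" or "v1.2.3"; everything else is ignored.
      SEMVER_TAG = re.compile(r'^v?(\d+)\.(\d+)\.(\d+)$')

      def latest_semver_tag(tags):
          """From a registry's tag list, keep only tags that look like
          semantic versions and return the highest one (None if none match)."""
          versioned = []
          for tag in tags:
              m = SEMVER_TAG.match(tag)
              if m:
                  versioned.append((tuple(int(g) for g in m.groups()), tag))
          return max(versioned)[1] if versioned else None
      ```

      Comparing the parsed numeric tuples (rather than the raw strings) is what makes 1.10.0 correctly sort above 1.9.3.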

      The Packages vs Containers documentation also talks about some of the quirks, if you're not familiar with them already.

      posted in Support
      atripp
    • RE: BuildMaster: Moving from one server to another

      Hi Sri,

      First, you can just replace the license key (Admin > License Key) to change the edition of the software (from Express to Enterprise); there's no need to migrate from one server to another.

      However, if you want to migrate from one server to another for a different reason, then there are two general approaches:

      • application-by-application, using the import/export feature; this is ideal when you only want to migrate limited, application-specific data
      • full migration of all data, using the back-up / restore instructions

      The Backing Up BuildMaster instructions detail this, but basically you just need three things:

      • BuildMaster Database; a SQL Server database that contains all of BuildMaster's configuration data
      • Encryption key; to encrypt/decrypt sensitive data in the database (like credentials); this is stored in the shared configuration file
      • Artifact Library Files; the path on disk (defined in Artifacts.BasePath setting) that contains all the files for artifacts you created within BuildMaster

      Hope that helps,
      Alana

      posted in Support
      atripp
    • RE: Agent initiated connection?

      I don't have a detailed timeline for Otter 3.0; the scope appears to continue to creep (but perhaps in a good way 🤔). Still, it's looking on track for early next year.

      Probably the best thing to do is to get in touch with our sales engineering team, so we can learn a bit more about what problems you're trying to solve, and can give some more details about how Otter 3.0 will help; they can at least show you what's upcoming, so you can better decide if it's a good fit.

      posted in Support
      atripp
    • RE: ProGet net5.0 docker run in centos 7.8 web can't start(5.3.17)

      Hello, thank you for helping to identify this issue, and how we can solve it.

      Just so I can understand the situation, I want to confirm: does setting the WORKDIR allow ProGet to run on CentOS 7?

      If so, do you think editing our Dockerfile like below (see the line I added) will fix the issue? It seems like an easy thing we can try in the next release.

      FROM mcr.microsoft.com/dotnet/aspnet:5.0.0
      
      EXPOSE 80
      
      COPY proget/ /usr/local/proget/
      
      # ****** ADD THIS LINE ******
      WORKDIR /usr/local/proget/
      # ****** / ADD THIS LINE ******
      
      ENV SQL_CONNECTION_STRING "Data Source=proget-sql; Initial Catalog=ProGet; User Id=sa; Password=;"
      ENV PROGET_SVC_MODE both
      
      VOLUME /var/proget/packages
      VOLUME /var/proget/extensions
      VOLUME /usr/share/Inedo/SharedConfig
      
      CMD ([ -f /usr/share/Inedo/SharedConfig/ProGet.config ] || echo '<?xml version="1.0" encoding="utf-8"?><InedoAppConfig><ConnectionString Type="SqlServer">'"$SQL_CONNECTION_STRING"'</ConnectionString><WebServer Enabled="true" Urls="http://*:80/"/></InedoAppConfig>' > /usr/share/Inedo/SharedConfig/ProGet.config) \
      && exec /usr/local/proget/service/ProGet.Service run --mode=$PROGET_SVC_MODE --linuxContainer
      

      Thanks, please let me know

      posted in Support
      atripp