Hi @caterina ,
It looks like there was an error processing the template (i.e. what's on the customize webhook tab). Can you share that, and maybe we can spot it?
It's likely a syntax error of some kind.
Thanks,
Steve
Hi @pbspec2_5732 ,
The script in the linked gist should fix the problem for you; it's not feasible to edit the database directly due to the complexity of the model.
https://gist.github.com/apxltd/351d328023c1c32852c30c335952fabb
Thanks,
Steve
Hi @r-vanmeurs_4680 ,
This is a false positive and you can disregard it; ProGet 5.2 is not impacted by the vulnerability in jQuery for many reasons, including the fact that the vulnerable code is not used and, even if it were, ProGet is protected on the server side from such "attacks".
Thanks,
Steve
What are you specifying for Distribution in the connector? When I tried it with a valid distribution, it worked.
You need to specify one of the available distributions, which are listed here:
http://archive.ubuntu.com/ubuntu/dists/
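For example, a connector configured along these lines should work (values are illustrative; any distribution listed at that URL will do, and the field names may differ slightly on the connector page):

# illustrative connector settings for an Ubuntu apt repository
Connector URL: http://archive.ubuntu.com/ubuntu
Distribution:  jammy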
Thanks,
Steve
Hi @caterina,
We haven't thought of adding a --stage option to the pgutil builds scan command, but of course it's possible. I know that we are exploring other options, like associating builds/projects with pipelines (i.e. a sequence of stages).
So perhaps that would mean pgutil builds scan --workflow=MyWorkflow, and then MyWorkflow would start/end in different stages.
For now, we'd love to see how you utilize the stages. Maybe it means calling two commands temporarily, and we can revisit once we add new features, etc.?
For the pgutil builds create command, the project must already exist, or you'll get that error. You can create or update a project using pgutil builds projects create.
So basically, these commands would probably do what you want?
pgutil builds projects create --project=BuildStageTest
pgutil builds create --build=1.0.0 --project=BuildStageTest --stage=DesiredStage
Note that when using pgutil builds create, you can also specify a stage name, like in the example above.
Hi @arunkrishnasaamy-balasundaram_4161 ,
Thanks for clarifying!
[1] The MavenIndex file is not at all required to download artifacts from a remote Maven repository, nor to see the latest version of artifacts. In ProGet, all this allows you to do is browse remote artifact files in the ProGet UI, which typically isn't very helpful.
[2] It's not possible to change this
[3] ProGet does not have "group repositories"; it uses feeds with connectors instead. The model is different, and in a lot of organizations, feeds with connectors will cache packages.
[4] With a setup like this, it's likely your ProGet configuration will be unsuccessful, or will at least give your users a big headache and lots of pain/confusion. This is considered an "old mindset" for configuring artifact repositories, one based on "files and folders on a share drive" rather than on packages.
This "closed approach" greatly slows down development, causes duplicate code, and creates lots of other problems. Modern development approaches do not use this level of highly-granular permission; instead, they take an innersource model. You do not need to make everything available to everyone.
In practice, however, less than 1% of your 2k projects will contain sensitive IP or data that other teams can't access -- those projects should be segregated into restricted feeds. The logic is, "if everything is sensitive, then nothing is sensitive."
[5] ProGet does not generate a "support zip file"; if we require additional information when supporting users, we ask for the specific information.
Hi @uvonceumern_6611 ,
Thanks for providing all of the additional information; based on what you shared, it looks like the file is actually being uploaded incorrectly... using multipart/form-data upload encoding instead of a basic PUT or POST of the body contents.
Please see the Upload Asset File documentation for more information.
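As a rough sketch, the upload should look something like this (the host, asset directory name, and credentials are placeholders):

# upload the file as the raw request body -- not as a multipart/form-data form field
curl -X PUT --data-binary @myfile.zip -u api:«api-key» "https://proget.example.com/endpoints/MyAssets/content/myfile.zip"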
Thanks,
Steve
Hi @arunkrishnasaamy-balasundaram_4161,
I'll do my best to answer these!
You can configure the FullMavenConnectorIndex job to run on a routine basis under Admin > Scheduled Tasks; it's not enabled by default because the Maven Central index is very large
I'm not sure what "Context path" means?
I'm not sure what you mean by "Group" repo?
The way to handle this in ProGet is by using Policies & Rules to define noncompliant artifacts; certain versions of log4j would be considered noncompliant because they have that severe vulnerability
We do not recommend 2000 feeds in any scenario; I wonder if there's a disconnect between what a "Feed" is and what you're looking for. A feed is a place where you store all of the artifacts for a division/group of projects. The volume isn't a problem, but even a massive organization should have dozens of feeds at most
This should show on the "History" page
There is the "Packags" page at the top that can do some cross-feed searching
Yes; see https://docs.inedo.com/docs/proget/administration/retention-rules
I'm not sure what a support zip file is.
In general you would follow the migration guides we've published; however, your existing artifact server may not be configured to allow importing. If you're running into issues, it's best to open a new thread on the forums and we can review/investigate
Hope this helps point you in the right direction,
Steve
Hi @husterk_2844 ,
I'm afraid we're at a loss here; no one else has reported any kind of errors like this, and I can't imagine what would even cause such a problem.
I suspect there is something off about your Docker Compose file? That seems to be the only thing different from the basic setup.
I would just try to re-follow the basic instructions we posted:
https://docs.inedo.com/docs/installation/linux/docker-guide
That's what we use to test, and lots of users install and upgrade without a problem.
Thanks,
Steve
Hi @forbzie22_0253,
There's no Windows event logged, but once the /health page is reachable, the application is ready. If you're using SQL Server and IIS on the same box, then both of those must first load before ProGet can start.
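If you need to script the wait, a simple poll against that endpoint works; for example (the host is a placeholder):

# keep polling /health until it returns a success status
until curl -sf https://proget.example.com/health > /dev/null; do sleep 5; done
echo "ProGet is ready"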
Thanks,
Steve
@v-makkenze_6348 whoops, good catch - yes thank you :)
Thanks so much Valentijn!
Looks like this was a display bug, and the code on Licenses Overview was looking at UsedByPackage_Count instead of UsedByBuilds_Count. Easy fix, which will ship via PG-2774 in the next maintenance release.
As an FYI, the package with GPL-2.0 is node-forge@1.3.1 in the VicreaNpmJs feed. Looking closer, that package is dual-licensed as BSD-3, so it's not really a problem.
That said, the Licenses Overview page predates Policies, and I don't think the "License Usage Issues" makes a lot of sense anymore. The old model (block/allow) was much simpler with a basic Allow/Block rule. However, Policies are quite a bit more complicated.
We're very open to ideas on what to do in its place, or if you have any suggestions on what could be improved in general in the SCA UI. It's very easy for us to "see" what you're talking about, since we have the backup now :)
Thanks,
Steve
Hi @russell_8876,
ProGet itself does not have an upload limitation, so it's likely something else like your reverse proxy server, etc. Typically such large requests are blocked/prevented by middleware.
That said, HTTP is not a reliable protocol and should never be used for such large requests. You will run into a lot of problems trying to upload 24GB files in a single request. You'll need a new approach.
The easiest solution is to use drop paths, and then a file transfer protocol that is designed for large/reliable file transfers (most are).
Another solution is to use "upload chunking", which only asset directories support; pgutil should handle the chunking/uploading for you. If you want to use packages, you can then import that uploaded asset into a universal package feed:
https://docs.inedo.com/docs/proget/upack/proget-api-universalfeed/proget-api-universalfeed-import
-- Dean
Hi @v-makkenze_6348,
In theory, you should be able to find the noncompliant build on the Projects > Builds page, then narrow it down from there. But if you have a lot that are noncompliant, this may be difficult.
You've sent us your database in the past; would you mind uploading a BAK again? We can take a look and improve the UX so this will be discoverable. You can use an old link that we sent you a while ago, or fill out a support ticket and we'll get you a new link.
Let us know if you upload it - and we'll take a look!
Thanks,
Steve
Hi @jw,
I'm afraid I'm at a loss then...
It seems to be related to one of the following settings / issues:
Based on the stack trace, I can't pinpoint which one it is, but I'll share the code - and maybe you'll see something.
I would first try to pinpoint which it is by trying as Admin vs Non-Admin, and then see which of those issues it could be.
private static List<Notification> GetNotificationsInternal(AhHttpContext context)
{
    var notifications = new List<Notification>();

    if (!string.IsNullOrWhiteSpace(ProGetConfig.Web.AdminBannerMessage) && (ProGetConfig.Web.AdminBannerExpiry == null || DateTime.UtcNow < ProGetConfig.Web.AdminBannerExpiry))
    {
        notifications.Add(new Notification(
            NotificationType.warning,
            ProGetConfig.Web.AdminBannerMessage
        ));
    }

    if (WebUserContext.IsAuthorizedForTask(ProGetSecuredTask.Admin_ConfigureProGet))
    {
        if (ShowUpdateNotification(context))
        {
            notifications.Add(new Notification(
                NotificationType.update,
                InfoBlock.Success(
                    new A(Localization.Global.UpdatesAvailable) { Href = UpdatesOverviewPage.BuildUrl(returnUrl: AhHttpContext.Current.Request.Url.PathAndQuery) }
                )
            ));
        }

        if (ShowLicenseViolationNotification(context, out var violationUrl))
        {
            notifications.Add(new Notification(NotificationType.error,
                new A(Localization.Global.LicenseViolation)
                { Href = violationUrl }
            ));
        }

        var expiresDate = Licensing.LicensingInformation.Current.LicenseKey?.ExpiresDate;
        if (expiresDate != null && expiresDate <= DateTime.Now)
        {
            notifications.Add(new Notification(NotificationType.error,
                new A(Localization.Global.KeyExpired)
                { Href = LicensingOverviewPage.BuildUrl() }
            ));
        }
        else if (Licensing.LicensingInformation.Current?.LicenseKey?.LicenseType == ProGetLicenseType.Trial)
        {
            var days = (int)expiresDate.Value.Subtract(DateTime.Now).TotalDays;
            notifications.Add(new Notification(NotificationType.warning,
                new A(Localization.Global.TrialWillExpire(ProGetConfig.Licensing.EnterpriseTrial ? "Enterprise" : "Basic", days))
                { Href = LicensingOverviewPage.BuildUrl() }
            ));
        }
        else if (expiresDate != null && expiresDate.Value.Subtract(DateTime.Now).TotalDays <= 45)
        {
            var days = (int)expiresDate.Value.Subtract(DateTime.Now).TotalDays;
            notifications.Add(new Notification(NotificationType.warning,
                new A(Localization.Global.KeyWillExpire(days))
                { Href = LicensingOverviewPage.BuildUrl() }
            ));
        }

        var extensions = ExtensionsManager.GetExtensions();
        if (!extensions.Any() || extensions.All(e => !e.LoadResult.Loaded))
        {
            notifications.Add(new Notification(
                NotificationType.error,
                InfoBlock.Error(
                    new A(Localization.Global.ExtensionLoadError) { Href = Pages.Administration.Extensions.ExtensionsOverviewPage.BuildUrl() }
                )
            ));
        }

        if (WUtil.ShowDockerRestartMessage)
        {
            notifications.Add(
                new Notification(
                    NotificationType.warning,
                    InfoBlock.Warning(Localization.Global.ContainerRestartNeeded)
                )
            );
        }
    }

    var v2Notifications = ShowV2DeprecatedQueriesUsedWarning(context);
    if (v2Notifications.Any())
    {
        notifications.AddRange(v2Notifications.Select(f => new Notification(
            NotificationType.warning,
            InfoBlock.Warning(new A(ManageFeedPropertiesPage.BuildUrl(f.Feed_Id), $"{f.Feed_Name} is using deprecated ODATA (V2) Queries."))
        )));
    }

    return notifications;
}
Hi @jw ,
Can you try restarting the web application (Admin > Manage Service)? Hopefully it will be resolved after that.
Thanks,
Steve
Hi @husterk_2844 ,
Those are some strange errors and I haven't seen them before. It seems that something is wrong with SQL Server.
What's interesting/notable here is that SQL Server is saying Incorrect syntax near 'GO' on the 20 CREATE TYPES.sql script. That script hasn't changed in 10+ years... and it's also unlikely that the latest SQL Server 2022 is failing to parse/execute it, but I can't think of anything else.
The error occurring on 10 SET DATABASE PROPERTIES.sql is also peculiar; that script has a TRY/CATCH, so it's not considered a failure -- but the expected error is definitely not "ALTER DATABASE statement not allowed within multi-statement transaction."
From here, I would "play around" with different versions of SQL Server and ProGet. We've never seen anything like this, so we don't really know where to start.
As for ProGet failing to start... a database failure would definitely yield that behavior, so that's not really the issue. The question is why these database errors are occurring.
Thanks,
Steve
Hi @506576828_9736 ,
We don't have any templates or wizards for Vue projects, but you can follow the guidance on Creating a Build Script from Scratch, which walks through the steps involved.
Once you capture the artifact, you can use or customize the Deploy Artifacts to File Share Script Template
Please share your experience, as it'll help future users searching for this as well :)
Thanks,
Steve
Thanks @johnsen_7555! If I can offer some advice....
The workflow you're creating is a bit complicated, and adding in the automation component is "yet another product/process" to own/maintain. On our end, we get support inquiries from confused new administrators who notice "undocumented" behavior (i.e. not on docs.inedo.com) in ProGet.
If you're not "worried" about malicious packages, then the main risks you are mitigating are unwanted licenses and vulnerabilities.
Both are only a problem if they go to production, and keep in mind that vulnerabilities need to be monitored after a package is in use, since they are often discovered long after the package enters your production software.
How about something like this:
- use policies to define which packages are noncompliant, and block noncompliant packages outright
- treat less severe issues as warn instead of noncompliant
- use pgutil in your CI/CD pipeline to prevent unaddressed warn packages from going to production
- review/address warn packages as you have time
Note that, in a future version of ProGet, we intend to add more intelligence to package analysis for OSS packages. For example, we would like to say "this nunnpy package has 1 version, is recently published, has no GitHub repo, etc., and therefore is noncompliant".
Hi @johnsen_7555,
I'm not really sure I totally understand the automated workflow you want to create; you mentioned earlier having an approval process?
Are there any gaps with the workflow I mentioned? Basically two feeds (approved, unapproved), where you use package promotion as the approval action.
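If you want to automate the approval action itself, a sketch against the package promotion endpoint would look something like this (the host, feed names, and key are placeholders, and the parameter names assume the current promotions API):

# promote an approved package from the unapproved feed to the approved feed
curl -X POST -H "X-ApiKey: «api-key»" "https://proget.example.com/api/promotions/promote" \
  --data-urlencode "packageName=numpy" \
  --data-urlencode "version=2.0.0" \
  --data-urlencode "fromFeed=unapproved-pypi" \
  --data-urlencode "toFeed=approved-pypi"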
We don't recommend using webhooks to automate ProGet itself. This can create some loops that will cause headaches.
Thanks,
Steve
Hi @johnsen_7555 ,
Ah ha, thanks for clarifying that!
This is the expected behavior, and the reason is a bit complex.
Unlike most package repositories, the PyPI Repository API (which a ProGet feed implements) does not provide any licensing information about packages. It's just a very basic listing of names and versions, which means that there is no license information (or description, author, etc). All of that is embedded in the package files.
However, pypi.org has a special API that ProGet queries to provide more information about a package hosted on pypi.org. This way, description and license information can be displayed on remote packages. But this API is only for pypi.org, and the pip client doesn't use it.
When you connect to another feed in ProGet, the regular API is used. And since the PyPI Repository API doesn't provide package metadata, this information isn't available. It's on our long-term roadmap to use a special API/method for ProGet-to-ProGet connections, but that's a ways off and requires a lot of internal refactoring.
That said, the workflow we support to accomplish what you want is as follows:
https://blog.inedo.com/python/pypi-approval-workflow/
Thanks,
Steve
Thanks for clarifying @johnsen_7555.
I'm struggling a bit to see what kind of configuration might cause this issue, or how to reproduce it. Is your python-accessible feed connected directly to pypi.org?
If you go to re-analyze the package, you should get a really long set of debug logs (no need to send them). But after you do that, can you try the download again?
Hi @johnsen_7555 ,
Sounds like you're building a sort of Python Package Approval Workflow, which is great to see.
If the user doesn't have permission to download the file, I would expect a 401 (if anonymous) or a 403 (if authenticated).
A 400 error is a bad request. It could be coming from ProGet, as ProGet will occasionally throw that message when there is unexpected input. But it could also be coming from an intermediate server that's processing the request before forwarding it to ProGet.
In this case, I believe pip is simply performing a GET on the URL in the error message:
.../download/numpy/2.0.0/numpy-2.0.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=6d7696c615765091cc5093f76fd1fa069870304beaccfd58b5dcc69e55ef49c1
I'm not 100% sure that's what pip is doing, but why not try a curl -v against that URL and see if you also get a 400?
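Something along these lines, for instance (the host and path prefix are placeholders; use the full URL from the error message, and the same credentials pip uses):

# -v shows the request/response headers, which makes the failure easier to spot
curl -v -u myuser:mypassword "https://proget.example.com/.../download/numpy/2.0.0/numpy-2.0.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl"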
If so, then you should get an error in the message body from curl; ProGet will write this out to the response stream.
If not, then you'll need to capture the traffic and see what the difference is. Maybe it's a header that's different? I'm not sure what would cause ProGet to yield a 400.
Let us know what you find,
Steve
Hi @scott-wright_8356 ,
Once connector caching is enabled, the error pattern is not used, so we only have this warning. I added a small change via PG-2726 that will add the connector name; this will appear in the next maintenance release (2024.9), scheduled for this week.
Removing connector caching should reveal the connector name, so maybe that helps you identify it until then.
Thanks,
Steve
@jw thanks for clarifying!
The pipelines in ProGet are not really meant to track status. The main reason for the build stages is to control automatic archival, issue creation, and notifications. For example, your stable releases might stay in a "Released" stage indefinitely.
We also had (in the preview feature) three build statuses: Active, Archived, Released. We don't use Released currently, but it's definitely something we may bring back. You don't want to delete a Released build, but you would probably want to delete an Archived build.
Anyway, I'd check out the pgutil builds promote command - that way you can keep your "archival rules" in ProGet.
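For instance, something like this (a sketch; the flag names mirror the other pgutil builds commands, so double-check them against pgutil builds promote --help):

# move a build to the stage whose archival rules should apply (values are illustrative)
pgutil builds promote --project=MyProject --build=1.2.3 --stage=Archived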
Hi @jw,
We'd love to learn more - why not?
We envisioned that there would be lots and lots of builds in the Build stage (i.e. created by a CI server), and the ones released might go to a stage like Production.
Thanks
@jw this was not addressed; the "build pipelines" are fairly primitive in ProGet 2024, and we plan to review them as a whole as we get more feedback from users
If you click on "all builds", that might be the view you are expecting
Hi @jw ,
We'll address that via PG-2718 by displaying a message on the Bulk Edit/Promote Pages if the user lacks permission to delete or promote the selected packages.
Thanks,
Steve
Hi @fabrice-mejean_6049 , @brett-polivka
Thanks for the feedback!
With ProGet 2023 out of the way (which made developing new feeds a lot simpler), and the addition of the Cargo Registry Web API, this is something we're much more open to implementing.
We'd want a user partner: someone who is already using Rust/Cargo and would be able to work with us on testing/developing this. So far, there hasn't been any market demand from prospects or other users... and for your team, it seems like a "nice to have" more than anything.
Thanks,
Steve
Great!
I don't believe that there is any public reporting of VSIX package vulnerabilities, so this is not data that would be available in ProGet either
Hi @arkady-karasin_6391 ,
Thanks for sharing the error; basically, this error means that you're connecting to an invalid feed/URL. You can only connect to a VSIX feed, which is an ATOM-based format.
Note: You cannot mirror the "official" Visual Studio Gallery URL, as that is technically not a VSIX gallery. It uses a different and undocumented format, and Visual Studio will not recognize that format for anything except the official gallery.
Connectors in VSIX feeds aren't very useful, and are really only good for connecting to other ProGet instances.
Thanks,
Steve
Hi @Scati,
It looks like there isn't a way to do this currently; we'll fix this via PG-2716 in an upcoming maintenance release. Maybe this week's, but more likely two weeks from now.
In ProGet 2024, debian should create a "Debian" feed, and debianlegacy will create a "Debian (Legacy)" feed. We'll also add a feedGroup property.
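Once that ships, the call would look something like this (a sketch; the host and key are placeholders, and the JSON property names assume the current Feed Management API):

# create a Debian feed via the Feed Management API (values are illustrative)
curl -X POST -H "X-ApiKey: «api-key»" -H "Content-Type: application/json" \
  "https://proget.example.com/api/management/feeds/create" \
  -d '{ "name": "my-debian-feed", "feedType": "debian" }'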
Thanks,
Steve
Hi @daniels,
This is technically possible, but it's not something that we support out-of-the-box I'm afraid.
The closest way to accomplish something like this would be:
So it's not easy from the user perspective. However, if Otter and this approach look like they will work for you, then it's something we can definitely explore together as a user/customer, and build out a use case / case study / etc. It'd be best to start a conversation with someone on our customer/sales team about that.
Otherwise, there really hasn't been enough demand for this particular use case, and it's hard enough to market Otter's other use cases, let alone develop new ones ;)
FYI - note that you can also run the Inedo Agent in outgoing mode:
https://docs.inedo.com/docs/inedoagent-overview#communication-modes
Thanks,
Steve
Hi @arkady-karasin_6391 ,
Under Admin > Diagnostic Center, can you locate the error message and share the full stack trace?
Thanks,
Steve
@parthu-reddy but let us know if you'd like to use a patch/prerelease. We can ship it shortly after 2025.6 is released, and it'd likely be the only fix in that version
Hi @parthu-reddy,
That's generally our plan, and we'll fix it via PG-2702. It's currently scheduled for 2025.7, on June 14.
Thanks,
Steve
Hi @parthu-reddy ,
It looks like the version field on Maven artifacts is limited to 50 characters, and 2.14.7-sbt-a1b0ffbb8f64bb820f4f84a0c07a0c0964507493 is 51 characters. This field has been limited since "day one" (many years now), so it's unrelated to the upgrade.
The only workaround is to use a smaller version number (or download the artifact and re-add it without that long hash or something).
Unfortunately this isn't an easy change, as that field is used in a primary key. We will have to research it and let you know.
Thanks,
Steve
@chris-blyth_5143 this means that the database and application code are out of sync somehow. That's unusual, and is often a result of restoring the database but not the application code, doing a manual installation, etc.
I would just uninstall everything, make sure all the components are gone (except the config file, of course), then reinstall and point to the same database. The database points to the file share, so it should work upon installation again
In that case please just upgrade and it should be resolved :)
Hi @sebastian ,
This was fixed in ProGet 2024 via PG-2630 (FIX: Dual License Packages should show as compliant if one or more licenses are compliant). It was a bug in the implementation of policies, so it wouldn't work in ProGet 2023 either.
So this should get fixed once you upgrade :)
Thanks,
Steve
Hi @Darren-Gipson_6156, sounds like you've found the right tools to configure the integration.
Here's some more information about the Advanced settings:
https://docs.inedo.com/docs/various-ldap-v4-advanced
The DOMAIN\username situation is a little complex. The DOMAIN is considered a NetBIOS alias, and needs to be mapped to a domain to search (like domain.com). Then an LDAP query is constructed based on that. So in other words, you can't search directly for DOMAIN\username in a search like that.
Try adding a NetBIOS alias mapping in the advanced settings, like DOMAIN=domain.com; that might allow you to log in.
Hi @jw ,
FYI - We just wanted to clarify what "inconclusive" meant - this was a "late" change on our end, and we realized the documentation wasn't very clear. Here is how we describe it now:
Inconclusive Analysis
A build package (and thus a build as a whole) can have an "inconclusive" compliance status. This will occur when two conditions are met:
- A rule would cause the build package to be Noncompliant, such as Undetected Licenses = Noncompliant or Deprecated = Noncompliant
- The package is not cached or otherwise pulled to ProGet, which means ProGet doesn't have enough information about the package to perform an analysis
You can resolve this by pulling or downloading (i.e. caching) the package in a feed in ProGet, or by not defining rules that require server-based metadata. For example, vulnerability-based rules can be checked without the package, but deprecation or license detection cannot.
The analysis message is incorrect, however; it should be "Package is Warn because of Package Status is unknown, No license detected."
Thanks,
Steve
Hi @sebastian ,
Thanks for all of the details; this is indeed a regression. We'll get this fixed via PG-2679 in the upcoming maintenance release, ideally later this week (Friday).
Thanks,
Steve
@daniel-scati looks like there was a redirect problem, but this is the method to try:
https://docs.inedo.com/docs/proget-api-packages-query-latest
So basically this:
GET /api/packages/MyDebianFeed/latest?name=pacomalarmprocessor
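For example, with curl (the host and API key are placeholders):

# query the latest version of a package in a feed
curl -H "X-ApiKey: «api-key»" "https://proget.example.com/api/packages/MyDebianFeed/latest?name=pacomalarmprocessor"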
ProGet dynamically generates these indexes based on an aggregation of locally stored packages and connector results on each request, so caching doesn't make a lot of sense.
npmjs.org, on the other hand, only needs to update indexes when a new version is uploaded, so the cache duration can be long.
Thanks,
Steve
Hi @pbinnell_2355 ,
It looks like you have Windows Integrated Authentication enabled. curl does not support this, but with PowerShell you would need to add -UseDefaultCredentials
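For example (a sketch; the URL is a placeholder):

# Invoke-WebRequest passes your current Windows credentials with -UseDefaultCredentials
Invoke-WebRequest -UseDefaultCredentials -Uri "https://proget.example.com/api/packages/MyFeed/latest?name=mypackage"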
Thanks,
Steve
Hi @jw ,
I haven't investigated this yet, but I assume that the results are the same in the UI? That's all just pulling data from the database, so I would presume so.
Could you find the relevant parts of the analysis logs? That would help us debug much more easily.
Thanks,
Steve
ProGet does not set cache headers for npm requests, so this behavior is expected.
Thanks,
Steve
@jw thanks for clarifying! We'll get the error fixed, but these would not show up in the export, since they are not build packages