Hi @rhessinger,
sorry for the late response; this somehow slipped my mind.
Your suggestion looks good to me. The description is easy to understand, and the parameter name is very descriptive.
Thanks,
Caterina
Hi @atripp,
I was able to test the new version of ProGet, and there still seems to be some odd behavior.
Again I am using my test project 'BuildStageTest'. I deleted all existing builds to give it a "fresh start".
Creating a build via SBOM upload seems to be working fine. This way I created build 1.0.0:
As you can see, there are no other builds in my project.
If I try to create the same version via "Create Build", an error is displayed (which is great):
But if I try to manually create build 2.0.0, I get the same error even though the build does not exist:
I thought that there might be an issue because build 2.0.0 already existed in previous tests. But I deleted all builds, as you can see from the UI and the database screenshots above.
So I also created a completely new project and tried to manually create builds. But for some versions I do get the error message (I was not able to manually create 1.0.0, 2.0.0, or 4.0.0, but other versions worked just fine):
Are you able to reproduce this behavior?
Thanks,
Caterina
Hi @atripp,
I think uploading an SBOM should not modify other metadata of a build (e.g. Build Stage, BuildStatus). I also can't see a use case where it would be necessary to upload an SBOM for an inactive build. And even if you do not prevent it, it should not change the BuildStatus, I guess.
For the duplicates: those are indeed two different builds. I can also see it in the database:
The difference here is Release_Number. If a build gets created via SBOM upload, the Release_Number is NULL. If I use "Create Build", the Release_Number is an empty string. Both Build_Numbers are the same, and neither contains whitespace.
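Just to illustrate my suspicion (a minimal C# sketch; I am only assuming that the duplicate check compares these three values directly):

// NULL and an empty string are distinct values, so a uniqueness check over
// (Project_Id, Release_Number, Build_Number) sees two different builds here.
var sbomBuild   = (ProjectId: 1, ReleaseNumber: (string?)null, BuildNumber: "1.0.0");
var manualBuild = (ProjectId: 1, ReleaseNumber: (string?)"",   BuildNumber: "1.0.0");
Console.WriteLine(sbomBuild.Equals(manualBuild)); // False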
Best,
Caterina
Hi @atripp,
to be honest, I am still confused. Let me explain my approach in more detail:
So I created a test project which does not contain any builds yet:
I want to add build 1.0.0 to this project. To do so, I upload an SBOM file containing the information about version 1.0.0. I also promote this version to the stage "Release":
Now the worst-case scenario happens: after months, someone uploads a different SBOM with version 1.0.0 to this project. I can confirm that the information from the "new" SBOM gets added. But the build is also moved back to the stage "Build":
I would expect that the build remains in its stage. But maybe you have a valid reason to do it like this?
So now I used "Create Build" from the dropdown to manually create version 1.0.0 again, which leads to two versions 1.0.0:
If I now upload an SBOM for version 1.0.0, only the first version 1.0.0 gets modified. The second, manually created version 1.0.0 remains unchanged.
Further, I created a version 2.0.0 manually using "Create Build". Then I uploaded an SBOM for version 2.0.0, which again results in two builds with version 2.0.0:
As far as I understood your explanation, uploading an SBOM should have added the information to the already existing build 2.0.0 instead of creating a new build?
You also said that the constraint for duplicates is <Project_Id, Release_Number, Build_Number>.
So for version 1.0.0, the Project_Id as well as the Build_Number are the same. The Release_Number is empty for both builds (or not set):
Maybe that's a problem leading to having a build twice? But this seems to be the default when using pgutil to create and upload an SBOM. I am not really sure how this Release_Number property should be used and how it is set by pgutil.
But if this behavior is on purpose maybe you can again explain to me why?
Thanks
Caterina
Hi @atripp,
thank you for the clarification. I can confirm this behavior.
Thank you very much,
Caterina
Edit:
If I first use "Create Build" to create build 3.0.0 and afterwards "Import SBOM" to create build 3.0.0, I end up with two builds with version 3.0.0 in my project.
The same happens if I do it the other way around.
Hi,
there are two ways to create a build in a project in ProGet 2024.
I can select between "Create Build" and "Import SBOM".
I was interested in what would happen if I created a build with the same version twice.
Both options create a new build, but with different behavior.
If I select "Create Build" to create e.g. build 1.0.0 and do this a second time, I get an error:
If I select "Import SBOM" to create build 2.0.0 and do this a second time, the first build seems to get deleted and overwritten: if build 2.0.0 is in another stage and I import the SBOM to my default stage, the build is afterwards available in the default stage and no longer in the other stage.
Shouldn't these two options behave the same?
If we use pgutil to import an SBOM, we would prefer that previous builds not be overwritten. It should not happen that a build with the same version gets created twice, but sometimes mistakes happen, and in that case the previous build should not be deleted without a warning.
Can you please have a look into this scenario? Maybe we can also discuss what the best behavior for pgutil would be in this case.
Thanks
Caterina
Hi,
while comparing ProGet 2023 with ProGet 2024, I noticed that a project shows an issue in ProGet 2024 but not in 2023.
In ProGet 2024 the project shows a warning for the package "DevExpress.Win.Gauges 21.2.7" (this package comes from a proxy feed). The warning is "because of package status (unlisted, deprecated) is unknown, no license detected".
But if I have a look at the package, the license is detected and it has no vulnerabilities or other issues.
In ProGet 2023 the same project referencing the same package does not have an issue.
I hope I could make the problem clear; unfortunately, the server won't let me upload screenshots for some reason.
Thanks,
Caterina
Hi @atripp,
the default behavior of the NpmDependencyScanner is to read the input file as well as all package-lock.json files found in the node_modules directory.
Rich and I had a longer discussion about this behavior last year (https://forums.inedo.com/topic/3934/pgscan-different-results-for-npm-dependencies/13).
The result of this discussion was to add "--package-lock-only" to be able to ignore the package-lock.json files in node_modules.
The code for this is already part of pgutil (the 'packageLockOnly' property in NpmDependencyScanner). The only thing missing is the possibility to set this property from the outside.
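What I have in mind is roughly this (a hypothetical sketch only; the constructor, property casing, and flag handling are illustrative, not pgutil's actual command framework):

using System.Linq;

// Illustrative wiring: forward a --package-lock-only flag from the command
// line to the property that already exists on NpmDependencyScanner.
var inputPath = args[0]; // e.g. the path to the .sln or package-lock.json
var scanner = new NpmDependencyScanner(inputPath)
{
    PackageLockOnly = args.Contains("--package-lock-only")
};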
Thanks,
Caterina
Hi @atripp,
this feature is necessary for us because we have products where we are not able to access the version otherwise. But we want to have the corresponding information about these products, so we implemented the possibility to read the version from a .dll or .exe.
It just extends the current functionality and I think it would also be a great addition for pgutil.
It is not limited to .NET, since we were using System.Diagnostics.FileVersionInfo.GetVersionInfo() to retrieve the information in pgscan.
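For reference, this is all it takes (the path is just a placeholder):

using System;
using System.Diagnostics;

// FileVersionInfo reads the version resource, which exists for native
// executables as well as for .NET assemblies.
var info = FileVersionInfo.GetVersionInfo(@"C:\path\to\product.exe");
Console.WriteLine($"Product: {info.ProductName}");
Console.WriteLine($"Version: {info.ProductVersion ?? info.FileVersion}");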
I think instead of 'identify' it is now 'builds scan'. The version is a property which is necessary for SBOM generation, so I would suggest that 'builds scan' and 'builds sbom' in pgutil should support this option.
Thanks,
Caterina
Hi,
I see that the dependency scanner for npm projects is able to handle "packageLockOnly" (as was the case in pgscan). But I can't seem to find an option to set this flag from the outside.
Maybe this option has been overlooked? Or am I missing it?
I appreciate the help.
Thanks,
Caterina
Hi,
in pgscan it was possible to read the product name and version from a given file using --fileInfo (for 'identify') or --consumer-package-file (for 'publish').
I think this is missing in pgutil? Or am I missing something here?
It looks like the SbomCommand and the ScanCommand can only handle the properties --project-name and --version. But I can't see an option for a file.
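From memory, our old calls looked roughly like this (paths are placeholders, remaining arguments omitted):
pgscan identify --fileInfo=<path-to-dll-or-exe> ...
pgscan publish --consumer-package-file=<path-to-dll-or-exe> ...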
Hopefully you can help me with this issue.
Thanks,
Caterina
Hi,
in pgscan we introduced the possibility to write an SBOM to a local output file instead of uploading it to the server, using the flag '--output-file'.
Is this still possible in pgutil? I can't seem to find an option to do this.
It was always really helpful for testing purposes.
Thanks,
Caterina
Hi @rhessinger,
thank you very much! We will just wait for the next release and I will come back to you if we encounter further problems.
Thanks,
Caterina
Hi @Dan_Woolf,
thank you for the quick fix.
But the promotion call does not seem to actually promote the build; it remains in its initial stage.
From pgutil I get the feedback "Promoting BuildStageTest 1.1.0 to Test/Production/Integration" (I tested it with all of our stages). But if I have a look at ProGet, the build is still in the initial stage "Build".
Can you reproduce this behavior as well?
Thanks,
Caterina
Hi @stevedennis,
thank you for your response.
Maybe you can also help me with a follow-up-problem.
I created a project using
pgutil builds projects create --project=BuildStageTest
and I created a build using
pgutil builds scan --input=<path-to-sln> --project-name=BuildStageTest --version=1.1.0 --api-key=***
Those steps worked perfectly fine:
Afterwards I wanted to promote this build to another stage:
pgutil builds promote --build=1.1.0 --project=BuildStageTest --stage=Test --api-key=***
but I got an error:
Server responded with BadRequest (400): Missing required query argument: build
I really hope you can help me here as well.
Thanks,
Caterina
Hi,
I have questions regarding the usage of pgutil.
Our previous approach using pgscan was to upload an SBOM which contained all necessary information; the project was created automatically.
As far as I understand, uploading an SBOM with pgutil (pgutil.exe builds scan) also creates the project without explicitly creating one (pgutil.exe builds create).
Using the promote command (pgutil.exe builds promote) a build can be promoted to a different stage.
Is it possible to combine this functionality? We would love to upload an SBOM directly to a specific stage. If that is not possible, have you thought about adding this functionality? Or will we have to call both commands consecutively, as sketched below?
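By "consecutively" I mean something like this (same syntax as my calls elsewhere; project name and version are placeholders):
pgutil.exe builds scan --input=<path-to-sln> --project-name=BuildStageTest --version=1.0.0 --api-key=***
pgutil.exe builds promote --build=1.0.0 --project=BuildStageTest --stage=Test --api-key=***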
Further, I have tried to create a build using pgutil but I keep getting an error.
My pgutil call looks like this:
pgutil.exe builds create --build=1.0.0 --project=BuildStageTest --api-key=*** --source=Test
And the error I get is:
Server responded with BadRequest (400): Invalid project name.
If I try to create this project directly in ProGet, there are no errors. Can you reproduce this problem? Or is something wrong with my call? I am working with the latest sources from GitHub.
Thank you
Caterina
Hi @atripp,
there are no connector errors listed in the diagnostic center and the connector itself also says there were no recent errors.
But we noticed another thing:
We have a package filter for this connector because we do not want to get DevExpress packages from this feed.
If we delete this filter, the connector packages are displayed in the feed.
As you can see, it is our only filter. If we add the filter again, the connector packages are gone.
Further, we first noticed this behavior in version 2024.13; in 2024.12 everything was still working fine.
Thanks,
Caterina
Hi,
we have a NuGet feed which is using a connector to NuGet (https://api.nuget.org/v3/index.json).
Having a look at all versions of a package we are able to see local packages as well as connector packages:
We noticed that starting with version 2024.13, the packages with NuGet Connector as source were no longer showing up:
The connector itself still seems to be working: using Visual Studio, I installed a version with a local source. Afterwards I manually changed the version in my .csproj file to a version with NuGet Connector source which is not visible in my feed. Visual Studio was able to successfully restore this NuGet package from my feed, but the package never showed up with a local source. This creates the impression that the connector is still working but is not displayed correctly.
Further, this is only an issue with our feed using the NuGet connector. We have several other feeds using a connector (e.g. https://registry.npmjs.org), and they show both local and connector packages.
Maybe someone can help me with this.
Thanks,
Caterina
Hi all,
I wanted to downgrade my ProGet installation from 2024 to 2023. The documentation for ProGet 2024 says: "However, if you need to rollback to ProGet 2023, you can do so without restoring the database by simply using the Inedo Hub."
But I cannot see such an option in the Inedo Hub. All I can see is an upgrade to the different versions of 2024. Maybe someone can help me with this.
Thanks,
Caterina
Hi @gdivis,
thank you very much for your response!
We are really looking forward to the finalized version :)
Thanks,
Caterina
Hi @atripp,
thank you very much for clarifying this.
We are not able to update to ProGet 2024 yet, so I will just resolve this issue manually for our projects.
Thanks,
Caterina
Hi,
we came across an issue in some of our projects which does not really make sense to us:
If I have a look at "@types/http-proxy 1.17.14" it does not show any vulnerability:
If I have a look at the vulnerability itself, it says that this vulnerability affects package "http-proxy <1.18.1":
Could it be that the scope of the package is not considered in vulnerability scanning, so that "@types/http-proxy 1.17.14" is matched as "http-proxy 1.17.14"?
Or is there any other reason for this vulnerability to show up in our projects?
I hope you can clarify this issue for me.
Best,
Caterina
Hi,
I had a look at pgutil for the first time, and I appreciate that you brought over the functionality to treat project references as package references (we introduced this feature in pgscan).
Maybe I am overlooking something, but I cannot find an option to set the flag. I guess it should be settable for 'builds sbom' and 'builds scan'?
Please let me know if I am missing something here.
Thanks,
Caterina
Hi @gdivis,
thank you for your response.
I think it is a little bit confusing in general.
Just for clarification:
We are not using any symbol packages at all (neither .symbols.nupkg nor .snupkg). We are only embedding PDBs right into our .nupkg files. But it would be nice to strip the PDBs on package download, and we can achieve this in ProGet by activating the symbol server for the legacy format.
I was just confused about why I have to use "legacy" for this approach, since embedding the PDBs instead of using symbol packages is also standard.
I guess normally it won't be necessary to enable a symbol server if you are using embedded PDBs. But we want to see the symbol information for our packages, and we want to be able to strip the PDBs on download.
It seems to be more of a hack to use the legacy symbol server for this, but we achieve our goals doing so. No need to file a bug for this.
Thank you,
Caterina
Hi,
I have to ask again:
https://learn.microsoft.com/en-us/dotnet/standard/library-guidance/nuget#symbol-packages
This documentation states that it is possible to use .snupkg files or to embed the symbol files in the NuGet package (as we do).
I can't find any documentation that says embedding symbol files is deprecated/legacy.
The documentation says: "The downside of embedding symbol files is that they increase the package size by about 30% for .NET libraries compiled using SDK-style projects. If package size is a concern, you should publish symbols in a symbol package instead."
But with the setting "Strip symbol files and source code from packages downloaded from this feed" that ProGet provides for the legacy format, we can avoid the problem of an increased package size.
Maybe you can explain further why only .snupkg files are supported as the standard?
Thanks,
Caterina
Hi @atripp,
our pull request contains a fix for the assignment of the scope to the group property of the dependency scanner.
But as already mentioned in my previous post, this only fixes the appearance of the packages in the overview. The URL behind the package name is still broken.
Maybe this needs to be fixed on the ProGet-side?
Cheers,
Caterina
Hi @atripp,
ah, I see, the settings were messed up: "Standard" was selected. Switching to "Mixed" made the download button appear again.
The info window of the settings states:
"We recommend following the "Standard" approach of storing NuGet Packages and Symbol Packages (.snupkg format) in the same feed".
Would it be better to work with .snupkg files instead of embedded portable PDBs?
Cheers,
Caterina
Hi,
we are using portable PDB files in combination with Source Link for debugging purposes.
Last time I checked, I was able to download either the package itself or the package with symbols. But I cannot find the option "Download with Symbols" anymore.
Has it moved? Or is it no longer possible?
I am asking because we have a problem with debugging a specific package and I would like to download it (including its symbols) to check if the package is ok.
Cheers,
Caterina
Hi @atripp,
I investigated a little bit further:
The NpmDependencyScanner does not treat the scope of a package as a 'Group'.
I added a little code for testing:
Afterwards the packages-list of my release looks like this:
But the URL behind a scoped package is still not working:
Cheers,
Caterina
Hi @atripp,
thank you for the quick fix. We would have suggested just the same solution.
I have tried the version, and scoped packages are now visible, but I ended up with the following:
Cheers,
Caterina
Hi @atripp,
thanks for having a look at it.
Yes, we are using pgscan to create and upload our SBOMs.
Cheers,
Caterina
Hi @atripp,
I just created an Angular test project with default dependencies and created the SBOM.
I sent the SBOM file via email to support@inedo.com.
If I upload this file to our ProGet server (using our npm proxy feed), I can see all packages without a scope, and the project is listed under usages. Packages with a scope are missing.
Cheers,
Caterina
Hello,
I am using pgscan to create and upload SBOM files for our releases.
I noticed that our SBOM files contain scoped npm packages (e.g. @angular/common, @babel/core), which is correct because we are using them.
But if I have a look at the packages of a specific release using scoped npm packages, those are not listed (Reporting & SCA -> Releases -> Specific Release -> Packages).
Vice versa, my release is not listed under "Usage & Statistic" of a used scoped npm package.
Has this behavior already been noticed?
Cheers,
Caterina
Hi Rich,
please take your time.
We used to have two different pgscan calls for NuGet and npm, and we ended up with two files on ProGet:
But did I get it right that if I export the SBOM, those two files are merged into one? In that case we would have to think about separating those pgscan calls again.
Thank you
Caterina
Hi Rich,
to get back to my "initial problem":
If I used pgscan with the "auto" type, I would run into the same problem: the dev dependencies within the package-lock.json would be omitted, but the node_modules directory also contains dev dependencies, and their package-lock.json files would be read as well, leading to my initial problem (having dev dependencies in the SBOM file). I think we are not able to distinguish between a dev dependency and a "real" dependency within the node_modules folder.
Of course I could explicitly specify to scan only the package-lock.json file with the "npm" type, but then I would have to make a second pgscan call for NuGet packages and would end up with two SBOM files. It is a lot more comfortable to have all dependencies in one SBOM file.
Further, pgscan with the "auto" type and pgscan with the "npm" type would by default list different npm dependencies.
Or did I misunderstand something?
Thanks
Caterina
Hi Rich,
we talked about these options as well, and we think that a switch to exclude dev dependencies could be helpful.
If pgscan takes an argument such as --exclude-dev, node_modules folders are ignored and only dependencies that are not flagged with "dev": true in the package-lock.json file are written into the SBOM file.
Otherwise, all dependencies in package-lock.json are listed and node_modules folders are included.
I can't imagine a scenario where I would want "package-lock-only" without "omit dev": dev dependencies would be listed in the SBOM but the node_modules would not be scanned, which would lead to an incomplete output, I guess. A rough sketch of the filtering I have in mind follows below.
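Roughly like this (a minimal C# sketch, assuming lockfileVersion >= 2, where dev dependencies carry "dev": true; this is not pgscan's actual code):

using System;
using System.IO;
using System.Text.Json;

// Keep only entries of the "packages" section that are not dev dependencies.
using var doc = JsonDocument.Parse(File.ReadAllText("package-lock.json"));
foreach (var pkg in doc.RootElement.GetProperty("packages").EnumerateObject())
{
    if (pkg.Name.Length == 0)
        continue; // the empty key is the root project itself

    bool isDev = pkg.Value.TryGetProperty("dev", out var dev) && dev.GetBoolean();
    if (!isDev)
        Console.WriteLine(pkg.Name); // e.g. "node_modules/@angular/common"
}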
Let me know how you think about it and what the thoughts of your team are.
Thanks
Caterina
Hi Rich,
I had to dive deeper into this topic to get a better understanding of it.
So basically what we are doing is calling "npm ci" followed by "ng build --configuration production".
The package-lock.json file contains all dependencies, production as well as dev dependencies. "npm ci" installs them all, which means that the node_modules folder contains dev dependencies too. But "ng build --configuration production" creates our production output, which has no dev references.
I tried calling "npm ci --omit=dev". In this case only production dependencies are installed and part of node_modules. But unfortunately, "@angular/cli" is a dev dependency and is needed to call "ng build".
Therefore, I would say that the node_modules scan should still be removed, since this folder can contain dev dependencies which are not part of the final product.
Further, I guess we should add a filter for the dev dependencies while parsing the package-lock.json in pgscan. Right now dev dependencies are part of the generated SBOM file, but packages like "@angular/cli", for example, are never shipped with our product.
I hope it is clear what I am trying to say.
Thanks
Caterina
Hi Rich,
thank you, I was looking for this conversation!
So our current problem is that ProGet determines a critical vulnerability in one of our projects:
Our affected project team contacted me and told me that the project is not referencing json-schema in any version. And this dependency is also not listed in the package-lock.json of the project.
But I noticed that our project references "minipass-sized" and "npm-normalize-package-bin" as developer dependencies. Both of them have "json-schema" as dependency in their package-lock.json files.
And I guess I just fixed my own problem here... Dev dependencies should not be part of my production output and thus not part of my node_modules folder. I guess I will have to take a look at the build process of the product again.
I will keep you updated.
Thanks
Caterina
Hi,
I noticed that the list of npm dependencies differs depending on which type is given.
If the input is a package-lock.json file with type "npm", only the dependencies of this file are processed.
If the input is a .sln with no type, pgscan scans for NuGet and npm dependencies. But for npm dependencies, all package-lock.json files are processed, including the ones under node_modules. This results in different npm dependencies.
I don't think that package-lock.json files under node_modules should be processed, since all necessary dependencies are part of my project's package-lock.json file.
Further, I would like to call pgscan on a .sln because then all project dependencies (NuGet and npm) are listed in one SBOM file.
Has anyone else had issues with this procedure? Or is there a valid reason why package-lock.json files under node_modules should be processed as well? I just think both pgscan calls should result in the same npm dependencies.
Thank you,
Caterina
Hi @Dan_Woolf,
so it seems to be a long-known issue: https://github.com/NuGet/Home/issues/3116
Thank you for your help.
Caterina
Hi,
we noticed a delay between uploading a package and being able to do a nuget restore on a consumer of this package.
More specifically:
We uploaded a new package, "DAP.Common 7.0.13", to ProGet. Afterwards, we manually updated the dependency to this new version in a consuming project. We tried to call nuget restore, but it failed with the following error: "Unable to find package DAP.Common with version (>= 7.0.13)". There were about 15 minutes between package upload and nuget restore. A few minutes later we tried again and the package was found. But the package could be seen in ProGet right after the upload.
We were able to observe this behavior with several packages. If the dependency got updated manually in a consuming project, nuget restore failed for a certain time period.
Unfortunately, we cannot say whether the problem would also occur if the NuGet package manager in Visual Studio had been used to update the package dependency.
Have you already seen such a behavior?
Hi @atripp,
I think "packages" has to be iterated instead; I haven't seen "dependencies" beneath "packages".
Further, the "empty" Key has to be ignored as it stands for the root project:
Maybe a little bit of parsing would be necessary.
In lockfileVersion 2, a dependency was listed like this (illustrative snippet, reconstructed from the npm documentation):
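"dependencies": {
  "@angular/common": {
    "version": "15.2.9",
    "resolved": "https://registry.npmjs.org/@angular/common/-/common-15.2.9.tgz",
    "integrity": "sha512-..."
  }
}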
In lockfileVersion 3 it looks like this (again reconstructed; note the "node_modules/" prefix and the empty key for the root project):
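"packages": {
  "": {
    "name": "my-project",
    "version": "1.0.0"
  },
  "node_modules/@angular/common": {
    "version": "15.2.9",
    "resolved": "https://registry.npmjs.org/@angular/common/-/common-15.2.9.tgz",
    "integrity": "sha512-..."
  }
}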
If desired, we can also upload package-lock.json files for testing via MyInedo.
We noticed that pgscan does not list any dependencies for one of our npm projects.
After debugging into it and comparing it with other npm projects, we noticed that there is a difference in the lockfileVersion of the package-lock.json files: the problem project has lockfileVersion 3, while the others have lockfileVersion 2.
pgscan tries to read the dependencies from the property "dependencies", which is a legacy property from lockfileVersion 1. lockfileVersion 2 was backward compatible, but lockfileVersion 3 (used by npm v9) is not: the newest package-lock.json no longer has the property "dependencies", and all dependencies are part of the "packages" property.
Here is the official documentation about it: https://docs.npmjs.com/cli/v9/configuring-npm/package-lock-json/#lockfileversion
Have you already noticed this breaking change in the package-lock.json files?
I already opened an issue for this topic on GitHub:
https://github.com/Inedo/pgscan/issues/33
Hi,
we have the assessment type "Manually Unblocked", which we assigned to some vulnerabilities a while ago. Now we noticed that only 5-6 vulnerabilities have this assessment, even though we assigned it to more than that. How is this possible? (ProGet 2022.29)
Further, after upgrading to ProGet 2023 on a test server (which is an exact copy of our live system), 0 vulnerabilities had this assessment; somehow this information got lost. Is this a known problem? Or is this behavior intentional?
Thanks
Looking at the Usage & Statistic page of a package, we can see the Latest Version of a consumer.
We noticed that the Latest Version is not the highest version of a consumer, but the version uploaded most recently.
E.g.: we uploaded a project in version 2.0.0, and 2.0.0 was shown as Latest Version in its dependencies. Afterwards we had to upload a fix from a branch in version 1.6.1. Now 1.6.1 is shown as Latest Version in the same dependencies.
One could assume that only versions below 1.6.1 are affected if a vulnerability is found in the package.
Has anybody else observed this behavior? Is it supposed to work this way?