Please see our documentation for cloud storage. There is a subsection for migrating a feed to cloud storage.
Thanks,
Rich
Hi @gravufo,
Great! Glad to hear it! Please post back if you find anything else.
I also recommend that you switch to the Inedo Hub in the future. We are in the process of deprecating our traditional installer. The Inedo Hub can update an installation previously created with the traditional installer, and it now supports offline installations as well, if you need that functionality.
Thanks,
Rich
Hi @gravufo,
Would you be able to rerun the database scripts on your database? You will just need to run the "Run inedosql to update the database" step of our manual install guide. Can you see if this fixes your issue?
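For reference, that step boils down to a single inedosql invocation along these lines (the script path and connection string here are placeholders; use the exact command from the guide):
# Run from the extracted SQL scripts directory, per the manual install guide
.\inedosql.exe update . --connection-string="Data Source=localhost; Initial Catalog=ProGet; Integrated Security=True;"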
Thanks,
Rich
We have finally been able to recreate this issue in our sandbox. We are currently working on a fix, which we expect to ship in the next version of ProGet, 5.3.10. This looks to be an issue with the Mono framework; we use the Mono runtime in our Docker images for ProGet. We are also going to release a .NET Core-based technical preview of the ProGet Docker image in version 5.3.10, in addition to our standard Mono-based version. Our internal testing is going very well, and it looks to have removed a lot of the gotchas that Mono has.
Thanks,
Rich
Hi @mcascone ,
We have fixed the issue and the published date will now update when the package is overwritten. This will be released in ProGet 5.3.9, which is due out this Friday, August 14, 2020.
Thanks,
Rich
Hi @viceice,
Thanks for the clarification on your environment! I see what is going on now. I have created a ticket, PG-1809, to track the fix for this. We expect this to be released in ProGet 5.3.11, which is due out on September 11, 2020. Basically, in that instance, we are not respecting the values within the X-Forwarded-* headers. I'll let you know if anything changes on the timeline.
Thanks,
Rich
Hi @markus4830,
I'm definitely sorry about this. The change was made to help aid improvements to other areas of the system related to NuGet. Unfortunately, it looks like it affected the NuGet API. Expect a more permanent solution in the near future.
Thanks,
Rich
Hi @viceice,
That error is safe to ignore. It is currently a known bug and we are looking to fix that in an upcoming version of ProGet. The ticket tracking the fix for the log message is PG-1841.
Long story short, the ProGet service correctly detected that the product wasn't activated, and then logged that message. But it was doing it every time it accessed license information, which is on every connector health check, replication run, etc.
Activation happens automatically as soon as someone visits the Web application, and re-activation is required after upgrading certain versions.
Thanks,
Rich
Hi @harald-somnes-hanssen_2204,
The fix is scheduled for release in ProGet 5.3.16 which is due out on Friday. I'll reply back if anything changes.
Thanks,
Rich
Hi @tkolb_7784,
If you are hosting on a Windows machine, the easiest solution right now is to migrate your server to IIS and then add an SSL binding to your site. If you do not want to purchase a new certificate and a self-signed certificate is too much work, you can use Let's Encrypt and configure it via win-acme.
If you do not want to use IIS, then you will need to use a reverse proxy to handle SSL connections. Any reverse proxy can be used and a pretty simple one to configure is stunnel. Most reverse proxies can also be used with Let's Encrypt.
If you are hosting via Linux (Docker), then you will need to use a reverse proxy to handle SSL connections. We have a documentation page covering different Linux-based reverse proxies, including an example for setting up NGINX. These reverse proxies also support Let's Encrypt.
Please let me know if you have any questions.
Thanks,
Rich
Starting in version 6.2.21, we actually made a new variable function to create the URL for you. You can use the $BuildMasterUrl(build) function and that will generate the URL. If you are running that from within the context of a build, you will not need to specify any other parameters. You can see more about its usage in the functions reference in BuildMaster by navigating to Administration -> Reference Documentation -> Functions. We have also added a $BuildMasterId function if you need to get the build ID directly.
Hope this helps!
Thanks,
Rich
Hi @toni-wenzel_6045,
Currently, the only way to set the description would be to do it using the UI or the Native API. In order to set it using the Native API, you would need to do the following:
First, get the Docker repository ID by making a GET request to <Base URL>/api/json/DockerImages_GetRepositoryByName?key=<API Key>&Feed_Id=<Feed_ID>&Repository_Name=<Repository Name>. This will return a list (there should only be one item in it), and you will need to get the DockerRepository_Id from that item.
Then, you will need to POST a JSON object to <Base URL>/api/json/DockerImages_CreateOrUpdateRepository.
The JSON object you would post would be:
{
"API_Key" : "<API Key>",
"Feed_Id": "<Feed Id>",
"DockerRepository_Id": "<ID from previous request>",
"Repository_Name": "<Repository Name>",
"RepositoryIcon_Url": "<Icon_URL>",
"ShortDescription_Text": "<Short Description>",
"FullDescription_Text": "<Full description (Readme.md)>"
}
Please note that you will need to populate all of the values. For example, if you do not populate the RepositoryIcon_Url, the value will be cleared.
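If it helps, here is a rough sketch of those two calls using curl; the host, API key, IDs, and field values are all placeholder examples:
# Look up the repository ID
curl "https://proget.example.com/api/json/DockerImages_GetRepositoryByName?key=abc123&Feed_Id=1&Repository_Name=myimage"
# Update the repository, populating every field (DockerRepository_Id comes from the first call)
curl -X POST "https://proget.example.com/api/json/DockerImages_CreateOrUpdateRepository" -H "Content-Type: application/json" -d '{"API_Key":"abc123","Feed_Id":"1","DockerRepository_Id":"42","Repository_Name":"myimage","RepositoryIcon_Url":"https://example.com/icon.png","ShortDescription_Text":"My image","FullDescription_Text":"Full readme text"}'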
Thanks,
Rich
Hi @kichikawa_2913,
Thanks for the information. I have updated our Docker Troubleshooting Guide to include details about root-less containers. We are also discussing internally if we want to change the default port to be > 1024 going forward.
Thanks,
Rich
Hi @Crimrose and @hyt5036_7430,
I just made some tweaks to our 1.9 packages and released the ProGet 5.3.16-compatible extensions as -RC versions. If you change your Extensions.UpdateFeedUrl to https://proget.inedo.com/feeds/PrereleaseExtensions, you can install the latest RC versions of these extensions, and they will work with the proget or progetmono image. Since the proget and progetmono images include the exact same features, you can switch between them without loss of data.
Thanks,
Rich
Hi @kichikawa_2913,
Thanks for providing this additional information. Please give me a bit of time to review this and get back to you. I was able to recreate this issue on the ProGet image, but I need to find the root cause of this first.
Thanks,
Rich
Hi @viceice,
I took a look at this and it looks like the issue is that the GitHub repository is not returning the published date. I have put in a fix, PG-1871, which will be released in ProGet 5.3.21.
Thanks,
Rich
I'm still waiting for feedback from the products team, but my idea is to basically add a setting that allows you to disable listing/searching the packages on that connector and only allow exact name matches in the UI. We do a similar thing for Docker connectors because not all registries support listing or searching. The product change ticket tracking this fix is PG-1916. I will reply as soon as we have confirmed the fix and scheduled it for release. I'm hoping we can get it in for ProGet 5.3.26.
Thanks,
Rich
Hi @kichikawa_2913,
I just released this extension to production. Please let me know if you don't see an official release.
Thanks,
Rich
Hi @kichikawa_2913,
We have identified the issue and are working on a solution. The good news is that this is an issue with our InedoCore extensions, so once we correct this, you will be able to just update that extension. I will let you know as soon as we have something for you to try (it should be tomorrow).
Thanks,
Rich
Hi @kichikawa_2913,
I'm going to attempt to recreate your error on my system as well and see if I can find the issue. LDAPS on Docker has not been a popular option with our customers so far due to the complexity of managing the AD certificates. I really appreciate your patience in working through this with us.
While I try to recreate this, could you use an incognito browser window and see if you are able to load the site and log in? Could you also try restarting your container again? And can you please verify the LDAPS connection is still working with the openssl command?
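For reference, the usual openssl check looks something like this (the host is a placeholder for your domain controller):
# Test the LDAPS handshake and print the certificate chain
openssl s_client -connect dc.domain.network:636 -showcerts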
Thanks,
Rich
Hi @kichikawa_2913,
I'm sorry for not including that in my previous comment. You are correct: 1.10.7 is the new version of the InedoCore extension that includes these fixes for LDAP and LDAPS on Docker.
Thanks,
Rich
Hi @kichikawa_2913,
That's great! Thanks for giving me an update. LDAPS is definitely more of an advanced configuration option currently. The hardest part of LDAPS is getting the certificate into the Docker container and keeping it valid and up to date. It depends on how your AD server is set up, but it will typically regenerate its certs and distribute them automatically. We actually use a third-party library, Novell.Directory.Ldap.NETStandard, for our AD connection in Docker because the built-in .NET 5 library lacks LDAPS support.
As I stated before, creating a new image using the ProGet image as your base image tends to be the easiest way to add certificates to your container. Our image is built on top of dotnet/aspnet:5.0.5 (Debian 10 based). There are a handful of ways to do this, such as using docker exec to add the certificates after the image has started. Stack Overflow can be helpful in adding certificates to your containers for your AD setup. Right now we have not determined a standard way of setting this up, since each instance we have dealt with seems to be configured differently, but we always welcome feedback on this configuration process.
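As a rough sketch, the docker exec route looks something like this on our Debian-based image (the certificate file and container name are placeholders):
# Copy your AD root certificate into the running container (the file must have a .crt extension)
docker cp ad-root-ca.crt proget:/usr/local/share/ca-certificates/ad-root-ca.crt
# Register it with the Debian certificate store inside the container
docker exec proget update-ca-certificates
# Restart the container so the service picks up the updated trust store
docker restart proget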
Thanks,
Rich
Hi @kichikawa_2913,
I have forwarded this over to our licensing team; they should be reaching out to you shortly.
Thanks,
Rich
Hi @kichikawa_2913,
I think I have identified the issue. I have just pushed another version of InedoCore, version 1.10.7-CI.2. Could you update and give that a try? I also added an option to bypass the LDAPS certificate verification; it is something I would only use while testing. The solution you have, adding your certificates as valid certs, is more secure. One last thing to make sure you set is the Domain Controller Host. It can just be set to your domain (e.g. domain.network, using your steps from above). Linux/Docker does not seem to resolve domain names the same way Windows does.
Thanks,
Rich
Hi @kichikawa_2913,
That's great to hear! Just to make sure, you left in the --expose=8080, correct?
For the LDAPS thing, that makes complete sense. The SSL certs are at a root level and there is nothing we can do to change that. But I will note that there is an undocumented feature I added to bypass certificate validation for LDAPS. If you navigate to Administration -> Change User Directory -> Advanced -> Active Directory (NEW) and then select Use LDAPS and Bypass LDAPS Certificate Validation, that will allow you to use LDAPS and simply bypass any certificate errors in the process.
Thanks,
Rich
Hi @kichikawa_2913,
Sorry, this is actually expected. When we release our products, they include the extension versions that were current at the time of the product release. In this case, I released this version of InedoCore after we did the product release.
If you look at our documentation for upgrading your Docker image, the command includes --volumes-from=proget-old. This automatically migrates the volumes created by the previous version of ProGet, which keeps the updated extension (as long as the previously installed extension is newer than the included extension version).
Also, in Administration -> Advanced Settings, you can change Extensions.ExtensionsPath to a mapped path; that does the same thing (if the version in this directory is newer than the included one) and gives you easier access to the extension files.
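Roughly, the upgrade flow looks like this (a sketch only; the container names, port, and tag are examples, so use the exact command from the upgrade documentation):
# Stop and rename the current container so the new one can inherit its volumes
docker stop proget
docker rename proget proget-old
# Run the new version; --volumes-from migrates the old volumes, including any newer extensions
docker run -d --name=proget --volumes-from=proget-old -p 80:80 -e SQL_CONNECTION_STRING='Server=SERVERNAME;Database=ProGet;User ID=USERNAME;Password=PASSWORD' proget.inedo.com/productimages/inedo/proget:latest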
Thanks,
Rich
Hi @kichikawa_2913,
I have an idea on how to accomplish this. It looks like Docker allows you to expose ports in the run command. Here is what I'm thinking should work. The first thing to do is to map a volume to /usr/share/Inedo/SharedConfig. In my example, I'll map SharedConfig.
So here would be the steps to try:
echo '<?xml version="1.0" encoding="utf-8"?><InedoAppConfig><ConnectionString Type="SqlServer">'"$SQL_CONNECTION_STRING"'</ConnectionString><WebServer Enabled="true" Urls="http://*:8080/"/></InedoAppConfig>' > SharedConfig/ProGet.config
podman run -d --userns=keep-id -v proget-packages:/var/proget/packages -v SharedConfig:/usr/share/Inedo/SharedConfig -v /etc/pki/ca-trust/source/anchors:/usr/local/share/ca-certificates:ro --expose=8080 -p 8080:8080 --name=proget -e ASPNETCORE_URLS='http://+:8080' -e SQL_CONNECTION_STRING='Server=SERVERNAME;Database=ProGet;User ID=USERNAME;Password=PASSWORD' -e TZ='America/New_York' -i -t proget.inedo.com/productimages/inedo/proget:5.3.32 /bin/bash
Can you please give that a try?
Thanks,
Rich
Hi @cronventis,
Thanks for sending this over to us! I can confirm we have received it and we are currently looking into this. I'll let you know when we have more information.
Thanks,
Rich
Hi @cronventis,
Thanks for confirming this for me. I already have the fix in for ProGet 6.0.5, which is expected to release on Friday.
Thanks,
Rich
Hi @NUt ,
Thank you for bringing this to our attention. This bug, PG-2126, will be fixed in ProGet 6.0.12. Going forward, ProGet Free will still allow you to configure everything and even test it via the "Test User Directories" button, but it will only allow you to log in using the built-in user directory and the username/password login option.
Thanks,
Rich
@mcascone said in buildmaster linux docker install: sa login failures:
docker exec -it inedo-sql /opt/mssql-tools/bin/sqlcmd
-S localhost -U SA -P 'redacted'
-Q 'CREATE DATABASE [ProGet] COLLATE SQL_Latin1_General_CP1_CI_AS'
Hi @mcascone,
This line has the issue. You created a database named [ProGet] instead of a database named [BuildMaster], as you stated in the connection string in the third command.
Thanks,
Rich
Hi @cronventis,
Thanks for sending this over. I have found the issue, PG-2064, and have fixed it. It will be released tomorrow in ProGet 6.0.5.
Thanks,
Rich
Hi @mcascone,
Sorry about that. I'll get that backported to ProGet 5.3 also. This will be tracked in ticket PG-2074.
Thanks,
Rich
Hi @cronventis,
I took a look through your tables and there is definitely something a bit odd going on with your Kubernetes output. Could you tell me what version of Kubernetes you are running?
Would it be possible to see the output from your Kubernetes API? Again, this is something you can send us via support@inedo.com with the subject [QA-729] Kubernetes API. To get the image list, you can simply run this PowerShell against your Kubernetes API:
$uri = [System.Uri]'http://localhost:8080/api/v1/pods?limit=999'
$response = Invoke-RestMethod -Method GET -Uri $uri.ToString()
$response | ConvertTo-JSON | Out-File "C:\temp\response.json"
Just change http://localhost:8080 to your Kubernetes API host and port.
Thanks,
Rich
Hi @cronventis,
I was able to recreate this issue and we should be able to get this corrected in the next release of ProGet. Can you please confirm which version of ProGet you are using?
Thanks,
Rich
Hi @cronventis,
The fix will only prevent new duplicates from being created, mainly because I cannot guarantee that the first vulnerability is always the properly assessed one. For now, the best option will be to run a SQL query directly against the ProGet database in SQL Server after you upgrade to 6.0.5.
I have created a SQL query that will delete all the duplicates except the first vulnerability that was added to ProGet. If that criterion works for you, this query should be good enough.
BEGIN TRANSACTION
DELETE FROM [Vulnerabilities]
WHERE [Vulnerability_Id] in (
SELECT v.[Vulnerability_Id]
FROM [Vulnerabilities] v
INNER JOIN (
SELECT [External_Id]
,[FeedType_Name]
,[VulnerabilitySource_Id]
,COUNT([External_Id]) as [NumberOfDuplicates]
,MIN([Vulnerability_Id]) as [FirstVulnerability]
,MAX([Vulnerability_Id]) as [LastVulnerability]
FROM [Vulnerabilities_Extended]
GROUP BY External_Id, FeedType_Name, VulnerabilitySource_Id
HAVING count(External_Id) > 1
) duplicates on v.External_Id = duplicates.External_Id
WHERE v.Vulnerability_Id != duplicates.[FirstVulnerability]
)
ROLLBACK
Currently, I have the script set to roll back at the end (meaning it won't actually delete the duplicates). If this works for you, you can simply change ROLLBACK to COMMIT and rerun the query, and it will remove the duplicates.
Please let me know if you have any questions!
Thanks,
Rich
Hi @kichikawa_2913,
Could you please open another topic for the Selenium.WebDriver.ChromeDriver issue? I think that is unrelated to the authentication issue.
Thanks,
Rich
Hi @cronventis,
Thank you for checking that for me. I dug into this further and it looks like we only use the Image_Id to find the image within a feed. We expect Kubernetes to return the Config Digest, and that is what is stored in the Image_Id column. I'm going to work on recreating this scenario and verifying what is being pulled from Kubernetes. With the holidays coming up, I will not be able to look into this until next week. You should hear back from us by Tuesday or Wednesday of next week.
Thanks,
Rich
I believe we have identified the issue, PG-2121, that is causing this. The fix will be included in ProGet 6.0.11, which is due out next week.
Thanks,
Rich
Hi @phillip-t_2200,
Azure SQL Managed Instances should work fine. We don't have any direct test cases for this, but based on our SQL implementation, we don't anticipate any issues.
Thanks,
Rich
Hi @Justinvolved,
That message that says "currently contains 0 items" is just telling you that the folder the artifact is deploying to contains 0 files. After that, it will deploy the artifact files. If you enable verbose logging (setting Verbose: true), you will see all the files transferred from the artifact to the working directory.
Thanks,
Rich
Hi @paul_6112,
Thanks for sending this over to us. I have resolved the issue in OT-505 and it will be released this Friday in Otter 2023.2.
Thanks,
Rich
Hi @paul_6112,
Thanks for verifying this for us. We were able to find an issue in our code. This has been fixed in BM-3909 and will be released this Friday in BuildMaster 2023.5.
Thanks,
Rich
Hello,
This is related to a known issue that's been addressed in ProGet 6.0.19 and ProGet 2022.5. So, your best bet is to upgrade and the issue will be resolved :)
The underlying cause is a few packages that have exceeded 2.2 billion downloads.
If upgrading right away is impossible or difficult, you can disable the connector as a workaround. Alternatively, you could block those packages with a connector filter and then upload them to your feed so that the counts won't come through the connector.
Thanks,
Rich
Hi @justin_2990,
We are actually in the process of developing dedicated npm operations, but we do not have anything ready as of yet. The easiest way to call npm commands is to use the Exec operation in OtterScript. Due to how the npm CLI writes its output, you need to add ErrorOutputLogLevel: Warning to the Exec operation. Here is an example of the npm install and npm publish commands:
set $NpmPath = C:\Program Files\nodejs\npm.cmd;
set $NodePath = C:\Program Files\nodejs\node.exe;
# Install Dependencies
Exec
(
FileName: $NpmPath,
Arguments: install,
WorkingDirectory: ~\Source,
ErrorOutputLogLevel: Warning
);
# Publish Package
Exec
(
FileName: $NpmPath,
Arguments: publish Source,
WorkingDirectory: ~\,
ErrorOutputLogLevel: Warning
);
When it comes to ProGet::Scan, it should work with all npm packages. It just reads the package-lock.json and records the dependencies in ProGet. You can see our implementation on the pgscan GitHub repository. If that doesn't work, you can always use a tool like CycloneDX to generate an SBOM and upload it to ProGet via the SCA API which has an endpoint for importing an SBOM file directly.
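If you go the SBOM route, generating one looks roughly like this (a sketch assuming the @cyclonedx/cyclonedx-npm tool; check the SCA API documentation for the exact import endpoint and authentication):
# Generate a CycloneDX SBOM from the npm project directory
npx @cyclonedx/cyclonedx-npm --output-file bom.xml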
One last thing, you mentioned that you are using ProGet. You can create an OtterScript module to register ProGet as your package source for npm. I do this with the following:
ConfigureNpmRegistry OtterScript Module
##AH:UseTextMode
module ConfigureNpmRegistry<$NpmPath, $ResourceName, $CredentialName>
{
set $ProGetNpmRegistry = $SecureResourceProperty($ResourceName, ServerUrl);
Exec
(
FileName: $NpmPath,
Arguments: config set registry $ProGetNpmRegistry,
WorkingDirectory: ~\,
ErrorOutputLogLevel: Warning
);
set $AuthToken = $SecureCredentialProperty($CredentialName, Token);
PSCall Base64Encode
(
Text: api:$AuthToken,
EncodedText => $AuthKey
);
Exec
(
FileName: $NpmPath,
Arguments: config set always-auth true,
WorkingDirectory: ~\,
ErrorOutputLogLevel: Warning
);
Exec
(
FileName: $NpmPath,
Arguments: config set _auth $AuthKey,
WorkingDirectory: ~\,
ErrorOutputLogLevel: Warning,
LogArguments: false
);
Exec
(
FileName: $NpmPath,
Arguments: config set email support@inedo.com,
WorkingDirectory: ~\,
ErrorOutputLogLevel: Warning
);
}
I also had to add a PowerShell script to handle the base64 encoding of the credentials:
<#
.SYNOPSIS
Base64 Encodes a string
.PARAMETER Text
Text to be encoded
.PARAMETER EncodedText
Encoded text string
#>
param(
[Parameter(Mandatory=$true)]
[string]$Text,
[ref]$EncodedText
)
$Bytes = [System.Text.Encoding]::UTF8.GetBytes($Text)
$EncodedText.Value = [Convert]::ToBase64String($Bytes)
I then call this using:
# Setup registry
call ConfigureNpmRegistry
(
NpmPath: $NpmPath,
ResourceName: global::ProGetNpmRepo,
CredentialName: global::ProGetNpmCredentials
);
These are all operations we plan to build into the npm extension, but for now this is the workaround until we get that extension up and running. I hope this helps! Please let me know if you have any questions.
Thanks,
Rich
Alana was correct: the change was not merged into the 2023 release. The fix, PG-2350, will be released on Friday in ProGet 2023.4. If you need it earlier than Friday, I can push a pre-release version of ProGet 2023.4 for you. Please let me know!
Thanks,
Rich
Hi @MY_9476,
Thanks for bringing this to our attention. I added a ticket, OT-502, to fix the issue. This should be released next week in Otter 2023.2.
Thanks,
Rich
Hi @forbzie22_0253,
There is no way to split out each feed to have different app pools. The only way to accomplish that is to have multiple instances of ProGet where each instance has a different feed. That would require a separate license for each instance.
Thanks,
Rich
Hi @Justinvolved,
The easiest way to set up a test environment for this would be to set up an instance of Otter (the free edition is fine). Then, once you have checked out https://github.com/Inedo/inedox-windows and made your changes, you can package the extension using the Inedo Extension Packager, which is available as a .NET tool. You can then navigate to the extensions page and upload the extension file to Otter. You may need to modify the AssemblyVersion in AssemblyInfo.cs to a version newer than the installed version to get it picked up as the latest. Alternatively, you can copy the extension file to the Extensions.ExtensionsPath and restart Otter to have it picked up as well.
The command I typically run to package the extension is:
inedoxpack pack InedoExtension Windows.upack -o --build=Debug
I run that command from the solution file's directory.
Hope this helps! If you have any questions, please let me know.
Thanks,
Rich