Hi @arozanski_1087,
This is really peculiar because we have an explicit filter on all of our LDAP queries to be a user, msDS-GroupManagedServiceAccount, or group. Does each user account have their own group managed service account?
Thanks,
Rich
Could you try disabling BitDefender temporarily to rule that out? Also, this could simply be related to the fact that Windows 7 is no longer in support, and there may be a security patch to the .NET Framework which is causing the error. Since Windows 7 no longer gets updates, it would not get any new patches for the .NET Framework or the FTP server. We leverage the classes built into the .NET Framework for FTP connections.
Thanks,
Rich
Hi @arozanski_1087,
I didn't see this answer before, but for LDAP are you using Single Domain Active Directory (Legacy) or Active Directory (LDAP)?
Thanks,
Rich
Thanks for confirming the permissions for me. As for adding the '' character in front of the folder names, I cannot see anywhere in our code that would do that. I do see that ServerPath will prepend a '/' character, but it is clear that the file list is working, so that would not be an issue.
I have been searching for the error "550 The specified network name is no longer available" and I am seeing quite a lot of people saying this is actually an issue with anti-virus on servers or web protection in a firewall or network proxy. This is an error coming directly from the FTP server. Do you have an anti-virus installed on the FTP server? If so, which one? Do you have BuildMaster configured to use a proxy? I see a lot of people complaining about this issue with AVG's Resident Shield.
@nkerifacclaud_6931 said in Inquiry about buildmaster FTP plugin:
Concerning the phrase "I would also check that your Get-Files operation is not happening until after the files have been put on the FTP server." i don't quite understand. Maybe you can explain again what you are expecting from my side.
Sorry about the confusion on this. I meant that whatever tool is copying the files to the FTP server should have completed the transfer before you try to pull the files. Some FTP servers will list a file that is currently in transit but not yet complete. This can happen with CI/CD tools (like BuildMaster, TeamCity, Jenkins, Azure DevOps, etc.). When those tools are executing operations, scripts, pipelines, etc. in parallel, one step can start the FTP file transfer while another kicks off a pull of those files at the same time. This is more to just rule that out as a possibility.
Hi @arozanski_1087,
Thanks for following up with us. I'm glad you got this working! Please don't hesitate to reach out if you have any other questions for us.
Thanks,
Rich
Hi @sbolisetty_3792,
I apologize for the delay in this response, but somehow my response never posted. I have assigned this to one of my colleagues to handle identifying the issue and planning the fix. @atripp will take this over from here and she will get back to you when an official fix has been scheduled. Thanks for all the help in troubleshooting this with me!
Thanks,
Rich
Good catch, I'm sorry for that; I actually did mean ** for the Include property. The error is an interesting one, though. I have seen this error commonly occur in two scenarios: 1) the file is deleted prior to the copy step (unlikely but possible), or 2) you have the permission to list folder contents, but not the permission to read the files. Can you verify that the account you are using for the FTP server has access to read the files on the FTP server? I would also check that your Get-Files operation does not run until after the files have been fully put on the FTP server.
Thanks,
Rich
I'm not sure I fully follow. Are you trying to download the entire contents of Folder? If so, use the Get-Files operation, set your ServerPath to /Folder, and set Include to *. This should download the entire contents of the folder and its child folders and files. Is that what you are doing now?
Thanks,
Rich
Hi @arozanski_1087,
The errors you are showing here look like a failure connecting to AD as a whole. One error shows it cannot connect to the AD server at all, which indicates the server is not operational, a bad AD certificate, or that your DNS server still has the IP address of your old domain controller tied to your local domain URL (ex: mydomain.local).
The other error message shows a login error for NETWORK SERVICE. This could mean the Set-SPN operation needs to be run to give the server access to the domain, or that you need to specify the login credentials in the AD settings.
You can reset your selected user directory back to the built-in one using the resetadminpassword command from the Service executable. You can then correct the issue and switch it back to your LDAP directory.
Thanks,
Rich
Can you give me a little more detail on what you are trying to do? From what I can see, you could specify your ServerPath as the folder you are looking to pull and include all files. Are you specifically looking for the include/exclude masks to work with folders? Or are you looking for something else entirely?
Thanks,
Rich
I have dug through our connector code and this looks to be an issue with Azure DevOps' implementation of the NuGet v3 API, specifically their RegistrationBaseUrl. If you notice in my example above, the dependencies property is an empty array, which is what we are expecting. Since Azure DevOps is actually populating it with an object, we attempt to parse the dependency, and then the empty id property causes our code to fail. I would submit this as an issue to Azure DevOps. I don't think this is an issue with your NuGet package, considering it has the nuspec properly defined and works fine when uploaded to ProGet.
Thanks,
Rich
Hi @arozanski_1087,
I'm sure we have asked you this before, but which LDAP user directory are you using?
Thanks,
Rich
Hi @Crimrose,
What version of the ProGet image are you running? Would you be able to send over your docker start command? Based on your error message, it sounds like you might have flipped the port mapping in the start command. When using Docker, you should be able to map the internal port 80 to any open external port on your Docker host.
Thanks,
Rich
Could you try uploading your .nupkg directly into ProGet and see if you get any error?
Thanks,
Rich
Starting in version 6.2.21, we actually added a new variable function to create the URL for you. You can use the $BuildMasterUrl(build) function and that will generate the URL. If you are running it from within the context of a build, you will not need to specify any other parameters. You can see more about its usage in the functions reference in BuildMaster by navigating to Administration -> Reference Documentation -> Functions. We have also added a $BuildMasterId function if you need to get the build ID directly.
Hope this helps!
Thanks,
Rich
This is most certainly interesting to me. This does not seem to be an issue with ProGet from what I can see. I created a simple example based on Newtonsoft.Json, with a nuspec file that looks like this:
<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2013/05/nuspec.xsd">
<metadata minClientVersion="2.12">
<id>Newtonsoft.Json</id>
<version>13.0.0</version>
<title>Json.NET</title>
<authors>James Newton-King</authors>
<owners>James Newton-King</owners>
<requireLicenseAcceptance>false</requireLicenseAcceptance>
<license type="expression">MIT</license>
<licenseUrl>https://licenses.nuget.org/MIT</licenseUrl>
<icon>packageIcon.png</icon>
<projectUrl>https://www.newtonsoft.com/json</projectUrl>
<description>Json.NET is a popular high-performance JSON framework for .NET</description>
<copyright>Copyright © James Newton-King 2008</copyright>
<tags>json</tags>
<repository type="git" url="https://github.com/JamesNK/Newtonsoft.Json" commit="7c3d7f8da7e35dde8fa74188b0decff70f8f10e3" />
<dependencies>
<group targetFramework=".NETStandard2.0" />
</dependencies>
</metadata>
</package>
I navigated to the URL you specified above and got this:
{
"count": 1,
"items": [
{
"count": 1,
"items": [
{
"@id": "http://proget.localhost/nuget/NoConnPublic/v3/registrations-gz/newtonsoft.json/13.0.0.json",
"@type": "Package",
"catalogEntry": {
"@id": "http://proget.localhost/nuget/NoConnPublic/v3/catalog/newtonsoft.json/13.0.0.json",
"@type": "PackageDetails",
"authors": "James Newton-King",
"dependencyGroups": [
{
"targetFramework": ".NETStandard2.0",
"dependencies": []
}
],
"description": "Json.NET is a popular high-performance JSON framework for .NET",
"id": "Newtonsoft.Json",
"licenseUrl": "https://licenses.nuget.org/MIT",
"licenseExpression": "MIT",
"minClientVersion": "2.12",
"projectUrl": "https://www.newtonsoft.com/json",
"published": "2020-11-06T20:09:19.47Z",
"tags": "json",
"title": "Json.NET",
"version": "13.0.0"
},
"packageContent": "http://proget.localhost/nuget/NoConnPublic/v3/flatcontainer/Newtonsoft.Json/13.0.0/Newtonsoft.Json.13.0.0.nupkg",
"registration": "http://proget.localhost/nuget/NoConnPublic/v3/registrations-gz/newtonsoft.json/index.json"
}
],
"parent": "http://proget.localhost/nuget/NoConnPublic/v3/registrations-gz/newtonsoft.json/index.json",
"lower": "13.0.0",
"upper": "13.0.0"
}
]
}
My guess is this is most likely an issue with the Azure DevOps registry returning the dependencyGroups incorrectly through the connector. The fact that id and range are specified but do not have a value is definitely what is breaking ProGet, because if a property is specified we expect it to have a value. Are you able to get that registration JSON directly from Azure DevOps' registry and see if the empty id and range are in there?
Thanks,
Rich
Hi @sbolisetty_3792,
What version of BuildMaster are you running? It looks like the third-party Slack extension was built using the legacy SDK and extension model, which will not work in BuildMaster 6.2.
We recently added documentation on how to POST to Microsoft Teams' webhook connectors. You could very easily adapt this to Slack by creating a new webhook in Slack, posting to the Slack webhook URL instead of the Teams one, and tweaking the message text you are POSTing.
Here would be an example Slack JSON to POST:
{"text": "**BuildMaster - New Build** $ApplicationName $ReleaseNumber.$BuildNumber has completed. View it at [Build $BuildNumber]($BuildMasterUrl(release))."}
Note: $BuildMasterUrl() is only available in 6.2.21+
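The same POST can be sketched in plain Python with only the standard library. This is an illustration, not BuildMaster's implementation; the application name, numbers, and webhook URL below are all placeholders:

```python
import json
import urllib.request

# Placeholder values; in BuildMaster these come from $ApplicationName,
# $ReleaseNumber, $BuildNumber, and $BuildMasterUrl(release).
def build_payload(app_name, release_number, build_number, build_url):
    """Build the Slack message body shown above."""
    text = (f"**BuildMaster - New Build** {app_name} "
            f"{release_number}.{build_number} has completed. "
            f"View it at [Build {build_number}]({build_url}).")
    return {"text": text}

payload = build_payload("MyApp", "1.2", "45",
                        "https://buildmaster.example.com/builds/45")

req = urllib.request.Request(
    "https://hooks.slack.com/services/T000/B000/XXXX",  # your Slack webhook URL
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req)  # uncomment to actually send the message
```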
Hope this helps!
Thanks,
Rich
Hi @viceice,
I did a few searches; have you tried setting the hostsProxyHeaders option? Maybe something like this: traefik.frontend.headers.hostsProxyHeaders=www-authenticate. The docs make it sound like Traefik will look in those headers for URLs that are proxied. You could give that a shot.
Thanks,
Rich
Hi @bju_2095,
Are you using a NuGet.config file for specifying your multiple feeds? From what I can see, Azure DevOps wants you to use a NuGet.config when specifying multiple NuGet feeds, so I would lean towards this being an issue with Azure DevOps. Can I confirm that you are using ProGet 3.2? You could create a new feed in ProGet with connectors pointing to the two existing feeds you need; then Azure DevOps would only need to connect to one feed.
Thanks,
Rich
Hi @harald-somnes-hanssen_2204,
The fix is scheduled for release in ProGet 5.3.16 which is due out on Friday. I'll reply back if anything changes.
Thanks,
Rich
Hi @bju_2095,
It looks like Azure DevOps is trying to use a username and password with ProGet, but the username is not set. If you are using an API key, then you would need to use api for the username and your API key as the password. Is that what you are currently using to connect to ProGet?
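As a quick sketch of what that credential pair looks like on the wire (the key and feed URL are placeholders, not real values):

```python
import base64
import urllib.request

api_key = "secret123"  # placeholder ProGet API key

# The username is literally "api"; the API key is the password.
token = base64.b64encode(f"api:{api_key}".encode()).decode()

req = urllib.request.Request(
    "https://proget.example.com/nuget/MyFeed/v3/index.json",  # placeholder feed URL
    headers={"Authorization": f"Basic {token}"},
)
# urllib.request.urlopen(req)  # uncomment to test the credentials
```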
Thanks,
Rich
Hi @viceice,
Thanks for sending this over. I will be debugging this issue more this week.
Thanks,
Rich
Hi @viceice,
Thanks for testing this for me. I think I may see your issue. When the Web.BaseUrl is set to https://proget-test.kriese.eu, the connector should use the URL https://proget-test.kriese.eu/nuget/b/v3/index.json. If you set it up that way, do you still get the license violations?
I'll look into the license key issue. It should work as the documentation describes it.
Thanks,
Rich
Hi @harald-somnes-hanssen_2204,
Thanks for following up with this. I have updated that documentation to clarify that the Free/Open Source Packages feed requires at least one connector configured before packages will be displayed. Hope this helps!
Thanks,
Rich
Hi @viceice,
Looking back at your other forum post: the last fix we released should allow you to set the Web.BaseUrl, and with the addition of X-Forwarded-* headers in your reverse proxy, it should not give you those license errors any more. Were you able to verify whether it still gives you the license errors?
Thanks,
Rich
Hi @arozanski_1087,
Technically they do not need to run as the same account, but it does simplify things, especially since they share the configuration file that contains the connection string. Also, please make sure that your IIS has the correct features installed; see the Configuring IIS Roles & Features for Inedo Products documentation for that information. Were you using that exact service account previously on the integrated web server install? If you were, there shouldn't be any issue connecting from IIS.
Thanks,
Rich
@viceice Good catch! @csyy321_2677 Could you update your port as @viceice has described and let me know if that works?
Thanks,
Rich
Hi @arozanski_1087,
Thanks for clarifying that for me. You will also need to make the ProGet Windows service run as that account.
Your SQL Server Connection string should look similar to this:
Persist Security Info=False;Integrated Security=true;
Initial Catalog=ProGet;Server=<SQL Server Name>
When you add your service account to SQL server, are you using the UI or are you using a SQL script? I ran into some weird issues using a service account where the only way SQL Server would format it correctly was by adding it through the UI. This was probably due to the unique environment at that company, but I have seen that happen.
As long as the Application Pool and Windows service run as that account, and you give that service account access to the ProGet database as you listed above, there should not be any issues authenticating. I have also seen some companies whose Group Policy (GPO) restricts which accounts may log on as a service. Can you verify you do not have any log-on-as-a-service restrictions? You can also try granting that user the right to log on as a service locally on that computer (you don't typically need that, but service accounts operate slightly differently).
Thanks,
Rich
Hi @arozanski_1087,
I apologize, can you please clarify this for me? Are you referring to ProGet connecting to SQL Server altogether or a user logging in with integrated authentication?
If it is ProGet connecting to SQL Server, please check that the user on the Application Pool in IIS has access to the database. You may either need to change which user the application pool runs as, or you will need to add the application pool user to the SQL Server database (ex: IIS APPPOOL\ProGet; note that you can only add an App Pool user if SQL Server is on the same server).
Thanks,
Rich
Hi @arozanski_1087,
It may be easier to disable Integrated Windows Authentication and then try to migrate it to IIS. If you do that, does everything work? Or do you see errors? If you see errors, what are they?
Thanks,
Rich
Hi @sbolisetty_3792,
Thanks for sending this over. I see what is happening, but I need a little bit of time to diagnose it further. It looks like the issue occurs when you specify an environment on the secure credential. If you remove the environment on the secure credential (but leave it on the server), it will work fine. I'm going to dig in further to determine what is going on. I'll let you know when I have more information.
Thanks,
Rich
Hi @viceice,
This is very weird to me. When you call docker login, you said it is initially doing a GET to http://proget-test.kriese.eu/v2/, not https://proget-test.kriese.eu/v2/. This means that something is configured in your docker client not to use HTTPS for docker requests. The reason ProGet is returning an http://-based realm is that the initial request is being made to a non-HTTPS URL.
Can you send a screenshot of your command output in the docker login process?
Thanks,
Rich
Hi @sbolisetty_3792,
Would you be able to share a screenshot of your Private Key credentials modal and your server configuration modal? Also, do you have an environment configured on either your server or your private key?
Thanks,
Rich
Hi @viceice,
What about your docker daemon.json? By default, docker will use HTTPS unless the host is listed within insecure-registries in your docker daemon.json. Can you please verify that the URL does not exist in that list?
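For reference, the entry to look for in /etc/docker/daemon.json looks like this (hostname shown only as an example). If your ProGet host appears in this list, the Docker client will talk to it over plain HTTP:

```json
{
  "insecure-registries": ["proget-test.kriese.eu"]
}
```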
Thanks,
Rich
Hi @aneequaye_1868,
I apologize, I was speaking of the mapped folder on the Docker host. Inside the container, it is still /var/proget/packages. Does this still happen on the latest version of ProGet? We did make some changes to how the headers are returned in the latest version, especially for feeds that require authentication.
Thanks,
Rich
Hi @viceice,
I was just about to link you to another forum post about X-Forwarded-* headers, but I realized that post was with you. Just out of curiosity, are you using a different URL for your docker feed than your nuget feed? Also, in your docker client, do you have your URL registered as an insecure registry?
Thanks,
Rich
Hi @viceice,
Thanks for the information. Let me dig into this a bit further and let me work on recreating it. Hang tight!
Thanks,
Rich
Hello @viceice,
What version of ProGet are you running? There was a bug fixed in ProGet 5.3.15 that corrected the headers in the docker login process when using the proget image. Also, please make sure that you set the Web.BaseUrl to the https reverse proxy URL in the Administration -> Advanced Settings section.
Thanks,
Rich
Hi @viceice,
How many docker feeds do you have configured on the fresh ProGet instance? I have seen one other user with a similar issue, but it only happened on the one feed, and manually creating the folder resolved it; any other docker feeds that were created did not seem to have the issue. Also, how did you create the folder? Did you create it from within the ProGet container?
Thanks,
Rich
Can you navigate to that artifact version in the UI and verify that cdd-application-5.7.0.5.zip exists in the list of files? If it does, can you click the file and see if it downloads manually for you?
Thanks,
Rich
Do you have any other applications connecting to your SQL Enterprise instance? Do you see any timeouts in those applications? Would you be able to run docker stats when these timeouts are occurring? Could you also raise the timeout in ProGet's SQL Server connection string?
Also, what version of SQL Server are you running?
Thanks,
Rich
Sorry for the late notice, but the release of 5.3.14 was pushed back to later today. I'm sorry if this caused any inconvenience for you.
Thanks,
Rich
Hi @srbauti_9412,
The best solution to this would be to create a new NuGet feed that uses a self-connector to connect back to your existing NuGet feeds. You can then use the connector filters to handle filtering by your package name format.
Thanks,
Rich
Hi @aneequaye_1868 & @informatique_1703,
I may have found a potential cause for this. Can you verify that a feed folder exists in your /proget-packages/ directory? You should see proget-packages/.docker/common and/or proget-packages/.docker/F<Feed ID> (ex: proget-packages/.docker/F1).
Thanks,
Rich
Hi @toni-wenzel_6045,
Currently, the only way to set the description would be to do it using the UI or the Native API. In order to set it using the Native API, you would need to do the following:
First, get the Docker repository ID by making a GET request to <Base URL>/api/json/DockerImages_GetRepositoryByName?key=<API Key>&Feed_Id=<Feed_ID>&Repository_Name=<Repository Name>. This will return a list (there should only be 1 item in it) and you will need to get the DockerRepository_Id from each item.
Then, you will need to POST a JSON object to <Base URL>/api/json/DockerImages_CreateOrUpdateRepository.
The JSON object you would post would be:
{
"API_Key" : "<API Key>",
"Feed_Id": "<Feed Id>",
"DockerRepository_Id": "<ID from previous request>",
"Repository_Name": "<Repository Name>",
"RepositoryIcon_Url": "<Icon_URL>",
"ShortDescription_Text": "<Short Description>",
"FullDescription_Text": "<Full description (Readme.md)>"
}
Please note that you will need to populate all of the values. For example, if you do not populate the RepositoryIcon_Url, the value will be cleared.
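Putting the two calls together, here is a rough Python sketch using only the standard library. The base URL, API key, feed ID, repository name, and all field values are placeholders; substitute your own:

```python
import json
import urllib.request

# All values below are placeholders -- substitute your own.
base_url = "https://proget.example.com"
api_key = "secret123"
feed_id = "1"
repo_name = "library/myimage"

# Step 1: look up the DockerRepository_Id.
lookup_url = (f"{base_url}/api/json/DockerImages_GetRepositoryByName"
              f"?key={api_key}&Feed_Id={feed_id}&Repository_Name={repo_name}")
# repos = json.load(urllib.request.urlopen(lookup_url))
# repo_id = repos[0]["DockerRepository_Id"]

# Step 2: POST the complete record back. Remember that every field must be
# populated; any field left out (e.g. RepositoryIcon_Url) will be cleared.
update = {
    "API_Key": api_key,
    "Feed_Id": feed_id,
    "DockerRepository_Id": "<ID from step 1>",
    "Repository_Name": repo_name,
    "RepositoryIcon_Url": "https://example.com/icon.png",
    "ShortDescription_Text": "My image",
    "FullDescription_Text": "# My image\n\nLonger readme text.",
}
req = urllib.request.Request(
    f"{base_url}/api/json/DockerImages_CreateOrUpdateRepository",
    data=json.dumps(update).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req)  # uncomment to apply the update
```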
Thanks,
Rich
Hi @csyy321_2677,
What version of CentOS is your Docker host running? Can you also tell me what version of Docker is installed?
Thanks,
Rich
No problem! If you are still having issues after you reach out, please feel free to reply back to this topic.
Thanks,
Rich
I believe Azure DevOps NuGet feeds deprecated alternate credentials back in March, and they will only support PATs moving forward. Here is an article on the deprecation: Azure DevOps Will No Longer Support Alternate Credentials.
I know you were specifying your Azure AD username and password, but I believe Microsoft treated those as an alternate method because the normal Azure AD login requires OAuth-based logins moving forward. The PAT was their solution for adding a way to authenticate without requiring a user to intervene in the login process.
Thanks,
Rich