What user principal are you running the Inedo Agent under?
The default is LOCAL SYSTEM.
Unfortunately, we didn't totally understand that detail when implementing the feed either... so it's not trivial to fix.
We'd like to gauge the impact of not changing it; aside from this search oddity, were there any other problems? Are packages not installing?
I'm not very familiar with PyPI packages, but I know there are some oddities with - and _, and that they are sometimes supposed to be treated the same, and sometimes not. We don't totally understand all the rules, to be honest (even after reading the PEP 503 specification).
In this case, the package is actually websocket_client, not websocket-client.
See: https://pypi.org/project/websocket_client/
When you search for websocket_client in ProGet, it shows up, as expected.
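For reference, PEP 503 does define one concrete rule here: for lookup and comparison purposes, runs of `-`, `_`, and `.` in a project name are normalized to a single `-`, case-insensitively. A quick sketch of that normalization:

```python
import re

def normalize(name: str) -> str:
    # PEP 503: collapse runs of -, _, and . into a single "-", then lowercase
    return re.sub(r"[-_.]+", "-", name).lower()

# Both spellings normalize to the same name, so per the spec they
# should match the same project:
print(normalize("websocket_client"))  # websocket-client
print(normalize("websocket-client"))  # websocket-client
```

So under a strict PEP 503 reading, `websocket_client` and `websocket-client` are the same project name; the display name on PyPI just happens to use the underscore.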
Hello; I think this may have already been addressed in PG-1477, which was shipped in ProGet 5.2.0.
Can you try upgrading to the latest version and trying again?
Hi;
It's hard to say why this is happening; if you don't see it in the UI, you shouldn't see it in Visual Studio.
It could be related to cached packages in Visual Studio. The best way to diagnose this would be to attach Fiddler or Wireshark so that Visual Studio's traffic goes through it, then monitor the exact queries that Visual Studio is sending to ProGet and see if any feed is actually returning those packages. If so, please share the details and we can try to investigate.
Otherwise, clear all of your local NuGet caches.
Best,
Alana
I can't say why it's like that, probably because NuGet.org didn't have it a long time ago?
One important note: changing this could break a lot of the tools we support and integrate with, including older versions of the NuGet client that NuGet.org no longer supports.
Can you bring this up with the Dependabot team? It should be a totally trivial fix on their end, and it's not necessary at all for them to use it.
Hello;
What are you trying to do?
This definitely won't work for several reasons.
Ensure-File is an "Ensure" operation; it will detect configuration drift and can optionally create the file if it doesn't exist. But in this case, it would create a 0-byte file named test.zip, which isn't a valid zip.
Extract-ZipFile is an "Execute-only" operation, which means it will run within a block (e.g. { ... }) only if drift was detected (i.e. the file didn't exist in the operation before); but if that were the case, it would fail because the file isn't a valid zip.
You probably want to use an Orchestration plan for this.
Please read through this eBook to get a feel for how Configuration and Orchestration Plans work: https://inedo.com/support/resources/ebooks/windows-first-iac-cca
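As a quick illustration of why the Ensure-File approach fails: a 0-byte test.zip has no zip structure at all, so any zip tooling will reject it. For example, in Python:

```python
import os
import tempfile
import zipfile

# Simulate what Ensure-File with no content would produce: a 0-byte "test.zip"
path = os.path.join(tempfile.mkdtemp(), "test.zip")
open(path, "wb").close()

# An empty file has no zip end-of-central-directory record,
# so it is not a valid archive
print(zipfile.is_zipfile(path))  # False
```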
Hi there;
ProGet is quite optimized as far as software goes (check out the ProGet Case Studies to see the scale that enterprises use it at), and a lot of our users have switched from competing tools for massive performance gains. But you need to put it on proper hardware.
Keep in mind that NuGet tools were designed to operate against NuGet.org, which runs on a massive server farm that serves primarily static content. You are making thousands of requests, and each request to ProGet is comparatively expensive: it proxies those requests to NuGet.org (assuming you have a connector configured), makes database connections, checks permissions, checks vulnerabilities, etc.
Each request to the server can result in several subsequent network requests... and it sounds like your ProGet server is a Win10 desktop; there's just no way it's going to keep up with developers hammering it from more powerful workstations.
Check out ProGet Free to ProGet Enterprise to see the performance recommendations we have, and other reasons organizations upgrade.
Hello; unfortunately we hit a few snags in getting the environment and PoC code running (a bit more variety in R/CRAN packages), and then we ran out of budgeted time :(
But it's still definitely on our roadmap, and we're going to take another stab at it in November.
Hi @gravufo, sorry for the delay; this fell through the cracks on prioritization, but it's been merged just now and will be shipped in ProGet 5.2.15, due next Friday.
Thanks!
So to be clear, is this in a "configuration plan" or an "orchestration plan"? Can you give some more context as to how you're trying to use this (i.e. why do you want to unzip)? Can you share the full plan?
Hello; I've updated the documentation to clarify this, but it's available starting in ProGet 5.2.9. So, you'll need to upgrade to enable it :)
hi; please share your OtterScript and we might be able to get a better idea :)
Hello; when you go to add a server (Servers page > Add) to Otter, just select "WSMan / Powershell" as the agent type, and then you'll be able to enter the details for a server that has PowerShell remoting enabled.
Does that help?
Hi; that message is coming from the operating system, and ProGet does not have any sort of limit (we have customers with terabytes of packages); perhaps you've configured it to a different drive, or something. I'd check there :)
Hello; for this, or really any other Dockerized web application, you can set up a reverse proxy. There are a ton of ways to do this, but I'd suggest searching for "How to Configure Nginx as an HTTPS Reverse Proxy" and seeing which option(s) make sense for your environment.
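For example, a minimal Nginx sketch might look like the following; the hostname, certificate paths, and upstream port are all placeholders you'd adjust for your environment (the port should be whatever port your ProGet container publishes):

```nginx
server {
    listen 443 ssl;
    server_name proget.example.com;  # placeholder hostname

    # placeholder certificate paths
    ssl_certificate     /etc/nginx/certs/proget.crt;
    ssl_certificate_key /etc/nginx/certs/proget.key;

    location / {
        proxy_pass http://localhost:8624;       # ProGet container's published port (assumption)
        proxy_set_header Host $host;            # preserve the original host header
        proxy_set_header X-Forwarded-Proto https;
    }
}
```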
This would be captured in an event (Admin > ProGet Event Log); the UI doesn't currently support package-level queries (you can filter a few other ways), but you could do a database query pretty easily...
SELECT *
FROM [EventOccurrences]
WHERE [Event_Code] = 'PKGADD'
AND [Feed_Id] = 2
AND [Details_Xml].value('(/Details/@Package_Name)[1]', 'varchar(max)') LIKE 'Carbon'
AND [Details_Xml].value('(/Details/@Package_Version)[1]', 'varchar(max)') LIKE '2.9.0'
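If you'd rather pull the XML apart outside of SQL, the same attributes can be read with a few lines of Python; note that the `<Details>` shape below is an assumption inferred from the query above, not a documented schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical Details_Xml payload; the attribute names mirror the SQL query above
details_xml = '<Details Package_Name="Carbon" Package_Version="2.9.0" />'

root = ET.fromstring(details_xml)
name = root.attrib.get("Package_Name")
version = root.attrib.get("Package_Version")
print(name, version)  # Carbon 2.9.0
```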
Hello; all versions of ProGet support gMSA. From the application (ProGet's) perspective, it's just a regular user.
I'll start with the easy one; connectors are "read only", there is no mechanism to push a package to a connector in ProGet.
As for authentication, I really don't know. That's weird, and it's definitely nothing in ProGet.
The 401 could be coming from a proxy server (inside your network) or from registry.npmjs.org itself, and there's no way to know within ProGet which it is.
I can't imagine it's a proxy server that somehow works with a npmjs.org username/password. But, maybe it is.
I don't know why registry.npmjs.org would send you a 401. Other users haven't experienced this, and I just tested it on a brand new instance myself... and the connector worked just fine. There has been talk of throttling large organizations, so maybe it's related to them blocking your IP or something?
It could have also just been a temporary glitch somewhere. Maybe if you remove authentication, it will work.
About the only way to diagnose this would be to attach ProGet to a tool like Fiddler or Wireshark so you can see the requests ProGet makes, then reproduce those exact same requests, and either reach out to your internal IT folks or try your luck with npmjs.
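Once you've captured the connector's request, you can also replay it outside ProGet with a few lines of Python to see who is actually returning the 401; the package name here is just an example:

```python
import base64
import urllib.request

def build_npm_request(registry_url, package, user=None, password=None):
    # Build the same kind of metadata request an npm connector would send,
    # optionally with Basic auth (as used for an npmjs username/password)
    req = urllib.request.Request(f"{registry_url.rstrip('/')}/{package}")
    if user is not None:
        token = base64.b64encode(f"{user}:{password}".encode()).decode()
        req.add_header("Authorization", f"Basic {token}")
    return req

req = build_npm_request("https://registry.npmjs.org", "express")
print(req.full_url)  # https://registry.npmjs.org/express
# urllib.request.urlopen(req) would then reveal whether the 401
# comes from the registry itself or from something in between
```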
Hello;
Your best metric is going to be your users; if they're complaining it's slow, then it's probably too slow. My guess is that it's intermittent (during peak times).
But in any case, it's not about the number of packages (a micro server can easily serve millions of packages nearly instantly), it's about the number of simultaneous connections (requests) to the server, and the features you've enabled (like license checking, etc).
It sounds like you've got a single "2 core instance with 8GB of RAM" server; while you can try to increase the hardware, the bottleneck is most likely network related; a single server might just not be enough to keep up with the traffic from your developers. Keep in mind that "plain old npm" (i.e. registry.npmjs.org) runs in a massive, dedicated data center and is heavily cached (basically read-only).
The ProGet Free to ProGet Enterprise article contains some general guidance that might help answer your questions.
You can try disabling connectors, disabling license checking, disabling vulnerability scanning -- but those are really important features, so your best bet will be to plan for a load-balanced scenario.
See How to Prevent Server Overload in ProGet to learn more.
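If you want numbers before making changes, a quick load probe can show how per-request latency behaves as concurrency rises. This is a rough sketch; the commented-out URL is a placeholder for your own feed endpoint, and a sleep stands in for the server here:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def probe(fetch, concurrency, requests):
    # Average wall-clock seconds per request at a given concurrency level
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        list(pool.map(lambda _: fetch(), range(requests)))
    return (time.perf_counter() - start) / requests

# In practice, fetch would hit a cheap endpoint on your ProGet server, e.g.:
#   fetch = lambda: urllib.request.urlopen("https://proget.example.com/nuget/Default/").read()
# Here a 50 ms sleep stands in for the server's response time:
avg = probe(lambda: time.sleep(0.05), concurrency=8, requests=40)
print(f"avg {avg:.4f}s per request")
```

Running the probe at increasing concurrency levels against a real feed will show roughly where a single server stops keeping up.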
Hello; this is the first such request for GCP. These are extensible components, and it's possible we can support it very easily -- but we like to partner with customers, since we have very little knowledge about GCP and like to make sure it works in the field before making it public.
The best way to get this rolling would be to submit a feature request, and ideally bring it up with your technical account manager to see if it can get escalated.
Hello; thanks for the report! I was able to confirm this is an issue (OT-350), and we'll get it fixed ASAP, perhaps even in the next maintenance release.
In the meantime, you can manually append &version=X to the querystring if you need to see the previous version.
The uninstaller found within the traditional installer (i.e. not the Inedo Hub) can be a bit tricky to debug... this is one of the many reasons we built the Inedo Hub. We will likely support only the Hub starting next year.
If you're not getting any errors when you run the installer, the easiest way to diagnose the uninstaller is by following the code. It may be a missing registry key, permissions, etc.
You can also use the code to see exactly what needs to be uninstalled, should you need to do it manually.
public static void Uninstall(UninstallOptions options)
{
    // Read the install paths from the registry; if they're missing, there's nothing to do.
    string servicePath;
    string webPath;
    try
    {
        GetRegistryInfo(out servicePath, out webPath);
    }
    catch
    {
        return;
    }

    string connectionString;
    try
    {
        GetServiceInfo(servicePath, out connectionString);
    }
    catch
    {
        return;
    }

    // Stop and unregister the Windows services, then remove the IIS site and app pool.
    StopService("INEDOOTTERSVC");
    StopService("INEDOOTTERWEBSVC");
    RunProcess(Path.Combine(servicePath, "Otter.Service.exe"), "uninstall");
    RunProcess(Path.Combine(servicePath, "Otter.Service.exe"), "uninstallweb");
    try { IIS.Current.DeleteWebSite("Otter"); }
    catch { }
    try { IIS.Current.DeleteAppPool("OtterAppPool"); }
    catch { }

    Thread.Sleep(5000);
    DeleteDirectory(webPath);
    DeleteDirectory(servicePath);

    // Optionally drop the Otter database.
    if (options.DeleteDatabase && !string.IsNullOrWhiteSpace(connectionString))
    {
        try
        {
            var connStringBuilder = new SqlConnectionStringBuilder(connectionString);
            var dbName = connStringBuilder.InitialCatalog;
            if (!string.IsNullOrWhiteSpace(dbName))
            {
                connStringBuilder.InitialCatalog = string.Empty;
                using (var conn = new SqlConnection(connStringBuilder.ToString()))
                {
                    conn.Open();
                    using (var cmd = new SqlCommand(string.Format("DROP DATABASE [{0}]", dbName), conn))
                    {
                        cmd.ExecuteNonQuery();
                    }
                }
            }
        }
        catch
        {
        }
    }

    // Remove registry keys and Start Menu shortcuts.
    Registry.LocalMachine.DeleteSubKeyTree(@"SOFTWARE\Inedo\Otter", false);
    Registry.LocalMachine.DeleteSubKeyTree(@"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\InedoOtter", false);
    DeleteDirectory(Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.CommonStartMenu), @"Inedo\Otter"));
    try { Directory.Delete(Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.CommonStartMenu), "Inedo"), false); }
    catch { }

    FinishUninstall();
}

private static void GetRegistryInfo(out string servicePath, out string webPath)
{
    using (var key = Registry.LocalMachine.OpenSubKey(@"SOFTWARE\Inedo\Otter", false))
    {
        servicePath = (string)key.GetValue("ServicePath");
        webPath = (string)key.GetValue("WebPath");
    }
}

private static void GetServiceInfo(string servicePath, out string connectionString)
{
    var xdoc = XDocument.Load(Path.Combine(servicePath, "Otter.Service.exe.config"));
    connectionString = xdoc
        .Element("configuration")
        .Element("appSettings")
        .Elements("add")
        .Where(s => (string)s.Attribute("key") == "InedoLib.DbConnectionString")
        .Select(s => (string)s.Attribute("value"))
        .First();
}
We don't have a bug bounty program, but please submit a ticket if you find anything and we will immediately address it.
Hi Fabrice, can you open a feature request for this? As a ticket, we can more easily track and prioritize it on our end.
We generally avoid just adding "pagination and order by" in data lists because it usually means you're having to click through page after page and play around with order to find what you're looking for. We'd like to make it much easier :)
So to do that, we'd love to learn more about the use case, because we have a feature on our roadmap to give a lot more insight into "which project uses a component", and I think you're definitely on the right track by using those headers.
So screenshots, etc., would be great (and having this tracked in that feature request is so much simpler for us to manage internally).
We can update this post once we have more insight, and if anyone is curious we can just reply with more info here :)
hi Jonathan, can you help me with a test case on the duplicate server issue?
Are you trying to do an infrastructure import that has the same server name defined twice in the same JSON document?
hi Jonathan,
I haven't forgotten about this; we are finally clearing out our backlog...
What type of errors are you encountering?
Can you be more specific about the "overwrite existing"? What are you importing, and what would you like? Currently it should be "merged" for the most part, depending on what you're wanting to merge...
Alana
We have a lot of customers who host an instance of ProGet in the cloud, but we do not offer hosting at this time for a number of reasons:
As far as how to host a ProGet instance in the cloud, you would be safe simply using built-in authentication and installing it on a public-facing server with only the HTTPS port open. "Anyone" could access the log-in page, and any feeds that you give "Anonymous" access to, but on its own, ProGet has been pen-tested and will be secure.
You could also do any of the following:
Those are progressively more difficult to configure, so it just depends on how much effort you want to put into a POC.
Note: we don't recommend regular HTTP, because then a "man-in-the-middle" attack could capture authentication information. But if it's only a POC, maybe it's ok.
Finally, some articles for your consideration;
hi Charlie,
Can you provide us with a very specific reproduction case?
Here is a very specific case where it's working as expected...
{
"name": "dfhack-test-1",
"version": "0.44.12-r2.p190516000",
"contents": [
{
"type": "virtualDirectory",
"source": "dwarffortress/core/linux64:0.44.12"
},
{
"type": "virtualDirectory",
"source": "dfhack/core/linux64:0.44.12-r2"
},
{
"type": "virtualDirectory",
"virtualPath": "hack/plugins",
"source": "dfhack/plugin/weblegends/linux64-pre:0.44.12-r2.p190516000"
},
{
"type": "virtualDirectory",
"virtualPath": "hack/plugins",
"source": "dfhack/plugin/df-ai/linux64-pre:0.44.12-r2.p190516000"
}
]
}
You can see this package live at https://proget.lubar.me/feeds/DwarfFortress-prerelease/dfhack-test-1/0.44.12-r2.p190516000
Not as of ProGet 5.1, but this is definitely on our roadmap and we plan to do it likely in 5.2 or 5.3.
Ah, I'm glad to hear it :)
Romp should be able to download packages directly and then install them; just have Romp point to a feed source instead of a local file on disk.
Anyway, this use case sounds really awesome; I would love to learn more about it once you get it closer to working.
That definitely sounds Windows Authentication related... do you have a site set up in IIS that doesn't have Windows Auth enabled? That might be the way to go.
Unfortunately, containers (even Windows ones) don't seem to work well with Kerberos/Windows Auth...
Eventually, but we haven't had too many requests for it :)
If this is something you're interested in, we'd love to partner with a customer or two to help get this developed.
I'm not 100% sure I understand, but when I see "small functions of reusable code", the first thing that comes to mind is Modules. These are bits of OtterScript that you can call from OtterScript using the call statement.
module Get-DomainMembership<$Param1, $OptionalParam2 = default, out $OutParam>
{
    set $OutParam = hello;
}

call Get-DomainMembership
(
    Param1: some value,
    OutParam => $SomeVariable
);
And then there are also Script Assets, which are PowerShell scripts that you call using the pscall operation from within OtterScript.
Is this helpful?
Good news, thanks to some data from customers enabling our CEIP, we've identified some problems and will fix them ASAP. Please wait for a new version on Friday of this week :)
Try stopping the service before running the installer, then it should work.
Got it! Then this would likely fall in line with an upcoming major feature, and the communication between Otter servers ("main" and "secondary" as you describe) would be done using API/HTTPS/web requests, in the direction of "Publisher" -> "Listener".
Basically, you would create a Git-based raft in both instances of Otter, and have them both point to the same Git repository.
The biggest gotcha would be that both instances can potentially update the raft, but as long as you're aware of that it wouldn't be a problem.
I see; in this case, go to Advanced Settings and change the Base URL property.
You can edit the connection string using the Inedo Hub; it should be listed right under the Configuration tab. Note you will need to stop/start the service and web app afterwards.
Might be best to stop both beforehand, however, so you don't have two potentially active databases.
How are you performing a feed migration? Is this being done by feed replication?
If so, the biggest consideration will be data. If you have terabytes of data to transfer, it's going to take a long time and probably kill your bandwidth. You could always use the drop-path feature, and then import from a physical disk you move/send?
Not any useful ETAs, but it's something we definitely want to accomplish this year (this has been a very long-standing request), just a matter of prioritizing which part of the year it gets done in, etc.
I'm not so familiar with SCCM, but I think I understand. Let me try to re-explain, though, just to make sure.
Am I understanding correctly?
Definitely on our roadmap, and we'd like to open-source it and make it a community item.
The hardest part, to be honest, is determining the input/outputs of these CmdLets. If you'd be willing to help us think these out, we'd love your help in designing it and making this a reality :)
Definitely! This was one of the use cases of rafts :)
Currently, Git repositories are the way to go, but we plan to support Universal-package Based Rafts as well.
Yes, ProGet supports this.
However, you will need a paid version of Chocolatey in order to do that:
https://chocolatey.org/faq#i-would-like-to-be-able-to-offer-my-non-admin-desktop-users-an-option-for-self-service-type-of-installations
I spoke with the engineering team briefly on the issue, and here's some quick feedback.
If we support "double-hop" authentication in the Inedo Agent, we would implement that functionality ourselves, using our own secure channels to ensure end-to-end security. Fortunately that's relatively easy to do, since we control both ends of the pipe.
This is not the case in PowerShell, and we cannot bypass PowerShell's security measures. PowerShell Remoting (and Kerberos) already has the concept of "double-hop" authentication. It's very complicated, so we'd encourage you to read this article to understand the challenges with it.
So, long story short, if we implement a relay service ("double-hop authentication") in the Inedo agent, we may be able to use the same UI to control PowerShell, but there is a significant amount of configuration and control required to get it to work.
As far as the concept of "collectors" goes, we will be researching a major new feature for Otter that simplifies the kind of collection you're describing, so we'll consider it there.
Just to close this post: this was fixed in InedoCore 1.0.12, after diagnosing the issue on a ticket (EDO-5683).
Are you saying you can't create a job from a template?
The main purpose of the template is:
Instead of specifying the plan and server targeting each time you create a job, you can create a job template that will specify these for you. The template can also specify variables that will need to be entered when the job is run.
While this can save you a few steps when running jobs, the biggest benefit of job templates is that, once defined, you can trigger jobs using the job trigger API. This will allow you to use nearly any other tool, such as Jenkins or a PowerShell script, to trigger an Otter orchestration job.
However, we plan to expand it to be a lot more useful with better variable selection, etc.
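As a rough sketch of what triggering a job from a script could look like (note: the endpoint path and parameter names below are illustrative assumptions, not the documented API; check your Otter instance's API reference for the real ones):

```python
import urllib.parse
import urllib.request

def build_trigger_request(otter_url, api_key, template, variables):
    # Assemble a POST request to a hypothetical job-trigger endpoint,
    # passing the template name, API key, and job variables as querystring params
    params = {"template": template, "key": api_key, **variables}
    url = f"{otter_url.rstrip('/')}/api/jobs/trigger?" + urllib.parse.urlencode(params)
    return urllib.request.Request(url, method="POST")

req = build_trigger_request(
    "https://otter.example.com",   # placeholder server
    "secret-api-key",              # placeholder key
    "Deploy-HDars",                # placeholder template name
    {"$Target": "Testing"},
)
print(req.get_method())  # POST
# urllib.request.urlopen(req) would then fire the job
```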
Did the changes from OT-284 help at all? That should have drastically sped up the servers overview page at least. It was in Otter 2.1.3.