@stevedennis I understand not wanting to go in blind on something so low-level. For what it is worth, the monkey-patch I applied has been stable, so far, in my lab environment.
Let me know if you need me to test a build, prior to Otter 2025.
@dean-houston said in PSEval can be called as $PSEval, @PSEval or %PSEval, but null/empty returns only make sense for $PSEval:
a variable prefix ($, @, %) is more of a convenience/convention, and the prefix isn't really available in any useful context. I'm almost certain you can do stuff like $MyVar = @(1,2,3), for example.
For what it is worth, if this is the intent, then it does not match what actually occurs. The execution engine throws exceptions when you mismatch the variable types:
# mixed sigils
{
set $ok = ""; set $no = "";
try { set $a = "blah"; set $ok = $ok: scalar; } catch { set $no = $no: scalar; force normal; }
try { set $b = @(1,2,3); set $ok = $ok: vector-as-scalar; } catch { set $no = $no: vector-as-scalar; force normal; }
try { set $c = %(a: 1, b: 2); set $ok = $ok: map-as-scalar; } catch { set $no = $no: map-as-scalar; force normal; }
try { set @d = "blah"; set $ok = $ok: scalar-as-vector; } catch { set $no = $no: scalar-as-vector; force normal; }
try { set @e = @(1,2,3); set $ok = $ok: vector; } catch { set $no = $no: vector; force normal; }
try { set @f = %(a: 1, b: 2); set $ok = $ok: map-as-vector; } catch { set $no = $no: map-as-vector; force normal; }
try { set %g = "blah"; set $ok = $ok: scalar-as-map; } catch { set $no = $no: scalar-as-map; force normal; }
try { set %h = @(1,2,3); set $ok = $ok: vector-as-map; } catch { set $no = $no: vector-as-map; force normal; }
try { set %i = %(a: 1, b: 2); set $ok = $ok: map; } catch { set $no = $no: map; force normal; }
Log-Information Mixed sigils: success${ok}, fail${no};
}
DEBUG: Beginning execution run...
ERROR: Unhandled exception: System.ArgumentException: Cannot assign a Vector value to a Scalar variable.
at Inedo.ExecutionEngine.Executer.ExecuterThread.InitializeVariable(RuntimeVariableName name, RuntimeValue value, VariableAssignmentMode mode)
at Inedo.ExecutionEngine.Executer.ExecuterThread.ExecuteAsync(AssignVariableStatement assignVariableStatement)
at Inedo.ExecutionEngine.Executer.ExecuterThread.ExecuteNextAsync()
ERROR: Unhandled exception: System.ArgumentException: Cannot assign a Map value to a Scalar variable.
at Inedo.ExecutionEngine.Executer.ExecuterThread.InitializeVariable(RuntimeVariableName name, RuntimeValue value, VariableAssignmentMode mode)
at Inedo.ExecutionEngine.Executer.ExecuterThread.ExecuteAsync(AssignVariableStatement assignVariableStatement)
at Inedo.ExecutionEngine.Executer.ExecuterThread.ExecuteNextAsync()
ERROR: Unhandled exception: System.ArgumentException: Cannot assign a Scalar value to a Vector variable.
at Inedo.ExecutionEngine.Executer.ExecuterThread.InitializeVariable(RuntimeVariableName name, RuntimeValue value, VariableAssignmentMode mode)
at Inedo.ExecutionEngine.Executer.ExecuterThread.ExecuteAsync(AssignVariableStatement assignVariableStatement)
at Inedo.ExecutionEngine.Executer.ExecuterThread.ExecuteNextAsync()
ERROR: Unhandled exception: System.ArgumentException: Cannot assign a Map value to a Vector variable.
at Inedo.ExecutionEngine.Executer.ExecuterThread.InitializeVariable(RuntimeVariableName name, RuntimeValue value, VariableAssignmentMode mode)
at Inedo.ExecutionEngine.Executer.ExecuterThread.ExecuteAsync(AssignVariableStatement assignVariableStatement)
at Inedo.ExecutionEngine.Executer.ExecuterThread.ExecuteNextAsync()
ERROR: Unhandled exception: System.ArgumentException: Cannot assign a Scalar value to a Map variable.
at Inedo.ExecutionEngine.Executer.ExecuterThread.InitializeVariable(RuntimeVariableName name, RuntimeValue value, VariableAssignmentMode mode)
at Inedo.ExecutionEngine.Executer.ExecuterThread.ExecuteAsync(AssignVariableStatement assignVariableStatement)
at Inedo.ExecutionEngine.Executer.ExecuterThread.ExecuteNextAsync()
ERROR: Unhandled exception: System.ArgumentException: Cannot assign a Vector value to a Map variable.
at Inedo.ExecutionEngine.Executer.ExecuterThread.InitializeVariable(RuntimeVariableName name, RuntimeValue value, VariableAssignmentMode mode)
at Inedo.ExecutionEngine.Executer.ExecuterThread.ExecuteAsync(AssignVariableStatement assignVariableStatement)
at Inedo.ExecutionEngine.Executer.ExecuterThread.ExecuteNextAsync()
INFO : Mixed sigils: success: scalar: vector: map, fail: vector-as-scalar: map-as-scalar: scalar-as-vector: map-as-vector: scalar-as-map: vector-as-map
There are also syntax elements which require specific context, such as foreach requiring a @vec (Iteration source must be a vector value).
I can certainly understand why you would not want to update the base classes so they provide the context (scalar, vector, map) to implementations, but I expect it is probably the safest solution for maintaining backwards compatibility with existing authored scripts (if that information is available to you at parse-time).
The alternative is to let the script author pass it along as an optional property to $PSEval() (similar to $GetVariableValue()), but this introduces another character which would then need to be escaped within the embedded PowerShell (i.e. the comma), and that would probably break existing authored scripts.
PSExec (i.e. Execute-PowerShell) can capture variables, but not output streams.
Understood with regard to the output stream.
However, no joy with this (i.e. capturing variables) either...
set $In = 12345;
set $Out = '<unset>';
Execute-PowerShell
(
Text: >-|>
Write-Verbose "Got some input: $In here...";
$Out = "This is nice";
Write-Verbose "inside the script, we have: $Out";
>-|>,
Verbose: true
);
Log-Information Outside the script, we have $Out;
DEBUG: Using Windows PowerShell 5.1...
DEBUG: Importing Out...
DEBUG: Importing In...
DEBUG: Got some input: 12345 here...
DEBUG: inside the script, we have: This is nice
INFO : Outside the script, we have <unset>
So something like this:
set $hello = world; $PSExec >> $hello = 'dears'; >>; Log-Information Hello $hello;
Calling $PSExec like this throws Unexpected token $. I suspect you meant to write PSExec (the operation) or $PSEval (the variable function).
However, substituting PSExec >> does not capture the output variable.
set $hello = world;
PSExec >>
$hello = 'dears';
>>;
Log-Information 1: Hello $hello;
DEBUG Using Windows PowerShell 5.1...
DEBUG Importing hello...
INFO 1: Hello world
Substituting set $x = $PSEval(>>...>>) does not even execute: The term '>>' is not recognized as the name of a cmdlet, function, script file, or operable program.
set $hello = world;
set $x = $PSEval(>>
$hello = 'dears';
>>);
Log-Information 2: Hello $hello;
DEBUG: Using Windows PowerShell 5.1...
ERROR: The term '>>' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
ERROR: The term 'world' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
ERROR: The term '>>' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
ERROR: PSEVal: PowerShell script failed with an error (see previous log messages).
set $hello = world;
set $x = $PSEval(>>
`$hello = 'dears';
>>);
Log-Information 3: Hello $hello;
DEBUG: Using Windows PowerShell 5.1...
DEBUG: Importing hello...
ERROR: The term '>>' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
ERROR: The term '>>' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
ERROR: PSEVal: PowerShell script failed with an error (see previous log messages).
set $hello = world;
set $x = $PSEval(>>`$hello = 'dears';>>);
Log-Information 4: Hello $hello;
DEBUG: Using Windows PowerShell 5.1...
DEBUG: Importing hello...
ERROR: The term '>>' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
ERROR: The term '>>' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
ERROR: PSEVal: PowerShell script failed with an error (see previous log messages).
Substituting set $x = $PSEval("...") does not capture the output variable:
set $hello = world;
set $x = $PSEval("
$hello = 'dears';
");
Log-Information 5: Hello $hello;
DEBUG: Using Windows PowerShell 5.1...
DEBUG: Importing hello...
INFO : 5: Hello world
set $hello = world;
set $x = $PSEval("
`$hello = 'dears';
");
Log-Information 6: Hello $hello;
DEBUG: Using Windows PowerShell 5.1...
DEBUG: Importing hello...
INFO : 6: Hello world
I think this might actually make my point for me. If you can get this wrong instinctively, what hope do I have (or my less code-savvy colleagues)?
@dean-houston I hadn't noticed Execute-PowerShell automatically capturing output variables. I'll do some more tests and see if it works for me.
This is probably a reasonable candidate for the /literal proposal described in the documentation, if you are still taking votes on that:
We're considering adding this using a /literal decorator at the end of a quoted or swim string. For example, "Not $escaped"/literal. If you have any interest in this, please submit a ticket with how you would plan to use this, and how it would be helpful; this is currently tracked as IEE-20, and we will link your ticket and seek to prioritize this.
@dean-houston I'm sure that fixing the syntax would not be straightforward, but introducing @ListSet and %MapSet variable functions in lieu is probably a good enough workaround for nearly all use cases where someone would want to do this.
There does not appear to be a means to capture the output stream from Execute-PowerShell, nor does it appear possible to get the value of a variable set inside the PowerShell script back to the enclosing OtterScript.
It is possible to get a result back from $PSEval(), but the options of $PSEval and Execute-PowerShell are not equivalent. For instance:
- Execute-PowerShell uses a Text property which does not interpolate OtterScript variables at the script level (it extracts and dispatches them), which means it better handles the $variables that are specific to the PowerShell script.
- $PSEval() does interpolate OtterScript variables, which means all the variables in the PowerShell script have to be backtick-escaped, whether they are specific to the PowerShell script or defined in the OtterScript context.
- $PSEval() does not like parentheses; many of these have to be escaped as well, and it is not always clear which ones.
- $PSEval() does not like newlines; these can be worked around with "swim" strings, but those still require escaping at least the variables.
- $PSEval() is not (currently) particularly supportive of scripts with varying output (see #4920).
Correctly escaping all the parentheses and variables in any PowerShell longer than a couple of lines is an exercise in torture.
Execute-PowerShell is clearly the better choice for more complex scripts, but seems to lack the means to return anything back to the caller.
As such, could Execute-PowerShell at least be augmented with an output parameter, to capture anything in the PowerShell output stream back to a target variable? (It should be made aware of scalar, vector or map context.)
Alternatively (or additionally), it would be useful for Execute-PowerShell to export variables back to the calling OtterScript context. Clobbering existing variables across the board might not be the right approach (so as not to break existing scripts), but perhaps Execute-PowerShell could be given an optional input parameter listing the variable names to export, so the capture is opt-in?
(I assume this was the intent of including them in ExecutePowerShellJob+Result, and that Result is correctly populated...)
# context aware output..
Execute-Powershell (
Text: >>
Write-Output "abc"
>>,
OutputVariable => $foo
);
Execute-Powershell (
Text: >>
Write-Output "abc"
Write-Output "def"
Write-Output "ghi"
>>,
OutputVariable => @bar
);
Execute-Powershell (
Text: >>
Write-Output @{a = 1; b = 2; c = 3}
>>,
OutputVariable => %baz
);
# or, with capture...
Execute-Powershell (
Text: >>
$a = 10 + 5
$b = @($a, 10)
$c = @{a = $a; b = $b}
>>,
CaptureVariables: @(a, b, c)
);
Log-Information $a; # -> 15
Log-Information $ToJson(@b); # -> ["15", "10"]
Log-Information $ToJson(%c); # -> { "a": "15", "b": ["15", "10"] }
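For what it is worth, mechanically reading variables back out of a finished PowerShell session is straightforward with the public hosting API, so an opt-in capture list seems feasible. Below is a stand-alone C# sketch (not Inedo code, and not a claim about how Execute-PowerShell is implemented) showing the mechanism such a CaptureVariables parameter could lean on:
// Stand-alone sketch (not Inedo code): values assigned inside a hosted PowerShell
// script can be read back from the runspace afterwards, which is the mechanism an
// opt-in CaptureVariables parameter could lean on.
using System;
using System.Management.Automation;
using System.Management.Automation.Runspaces;

class CaptureVariablesSketch
{
    static void Main()
    {
        using var runspace = RunspaceFactory.CreateRunspace();
        runspace.Open();

        using var ps = PowerShell.Create();
        ps.Runspace = runspace;
        ps.AddScript("$a = 10 + 5; $b = @($a, 10)");
        ps.Invoke();

        // Read the variables back out of the session state after the script ran.
        Console.WriteLine(runspace.SessionStateProxy.GetVariable("a"));                                  // 15
        Console.WriteLine(string.Join(",", (object[])runspace.SessionStateProxy.GetVariable("b")));      // 15,10
    }
}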
PS: various crimes against humanity, committed while trying to escape properly, are below to demonstrate the difficulties...
set $foo = "hello 'world'";
set @bar = @();
# 'natural' approach, newlines, no escapes: (otter compile error: 'Expected ;')
{
set @a = @PSEval(
for ($i = 1; $i -le 5; $i++) {
Write-Output ('{0} {1}' -f $foo,$i)
}
);
Log-Information `@a: $ToJson(@a);
}
# flatten newlines, no escapes: (otter runtime error: 'cannot resolve variable $i')
set @a = @PSEval(for ($i = 1; $i -le 5; $i++) { Write-Output ('{0} {1}' -f $foo,$i) });
# flatten newlines, no escapes: (otter runtime error: 'invalid use of vector expression in scalar')
set $a = $PSEval(for ($i = 1; $i -le 5; $i++) { Write-Output ('{0} {1}' -f $foo,$i) });
# flatten newlines, escape parens: (otter runtime error: 'cannot resolve variable $i')
set @a = @PSEval(for `($i = 1; $i -le 5; $i++`) { Write-Output `('{0} {1}' -f $foo,$i`) });
# flatten newlines, escape Powershell sigils and parens: ($foo is interpolated, not captured; Powershell syntax error at '-f hello')
set @a = @PSEval(for `(`$i = 1; `$i -le 5; `$i++`) { Write-Output `('{0} {1}' -f $foo,`$i`) });
# flatten newlines, escape all sigils and parens: ($foo is captured; powershell runtime error: 'missing closing ")"')
set @a = @PSEval(for (`$i = 1; `$i -le 5; `$i++`) { Write-Output ('{0} {1}' -f `$foo,`$i`) });
# flatten newlines, escape all sigils and parens: ($foo is captured; powershell runtime error: 'missing closing ")"')
set @a = @PSEval(for `(`$i = 1; `$i -le 5; `$i++`) { Write-Output `('{0`} {1`}' -f `$foo,`$i`) });
# as a swim string; same as above: ('cannot resolve $i' when unescaped; 'missing closing ")"' when escaped)
set @a = @PSEval(>-|>
for ($i = 1; $i -le 5; $i++) {
Write-Output ('{0} {1}' -f $foo,$i)
}
>-|>);
# as a variable, loaded by swim string; escaping all sigils: (captures $foo, this is the first one that works)
set $ps = >-|>
for (`$i = 1; `$i -le 5; `$i++) {
Write-Output ('{0} {1}' -f `$foo,`$i)
}
>-|>;
set @a = @PSEval($ps);
# executes with the 'natural' syntactic approach; captures $foo but has no means to return output
Execute-Powershell(
Text: >-|>
for ($i = 1; $i -le 5; $i++) {
Write-Output ('{0} {1}' -f $foo,$i)
}
>-|>
);
# similarly 'natural'; captures $foo and $bar, but does not populate @bar
Execute-Powershell(
Text: >-|>
$bar = @()
for ($i = 1; $i -le 5; $i++) {
$bar += ('{0} {1}' -f $foo,$i)
}
>-|>
);
(Reposted from GitHub on request)
Consider the arbitrary OtterScript...
set @fileLines = @PSEval(Get-Content -LiteralPath 'X:\PathTo\File')
if $ListCount(@fileLines) != 0 {
Log-Debug "OK"
}
If the Get-Content PowerShell cmdlet returns no lines, then the current approach is to attempt to set the return value of @PSEval to an empty string, rather than an empty array. This in turn causes Otter to throw a rather nasty exception:
Unhandled exception: System.ArgumentException: Cannot assign a Scalar value to a Vector variable.
at Inedo.ExecutionEngine.Executer.ExecuterThread.InitializeVariable(RuntimeVariableName name, RuntimeValue value, VariableAssignmentMode mode)
at Inedo.ExecutionEngine.Executer.ExecuterThread.ExecuteAsync(AssignVariableStatement assignVariableStatement)
at Inedo.ExecutionEngine.Executer.ExecuterThread.ExecuteNextAsync()
From an end-user perspective, trying to resolve whether a PSEval would return scalar, vector or hash is therefore fraught -- you have to know in advance whether the script will return 0, 1 or many items, and cannot safely deduce this by simply calling in vector context and testing the resulting length.
The best I can think of is try { set global @r = @PSEval(...); } catch { set global @r = @(); }, which technically proceeds, but the exception is still thrown and the overall job still results in an error state (assuming no subsequent force normal).
Ideally, PSEval should behave appropriately on a null/empty value, depending on whether it was called as $PSEval, @PSEval or %PSEval -- in scalar context, it is fine to return the empty string; in vector context, it would be better if it returned an empty vector; in hash context, an empty hash.
I appreciate that this might not be resolvable straight away, as you probably need assistance from whatever parses and invokes the script line upstream (maybe ExecuterThread?) to pass the desired RuntimeValueType to you. I expect it either needs to be supplied by extending IVariableFunctionContext (which may have side-effects on implementors of that interface) or, more likely, introduced as a protected internal settable property of VariableFunction itself.
Alternatively, if it is possible to alter RuntimeValue so it has a dedicated constructor/static sentinel value for empty values, ExecuterThread.InitializeVariable might be better placed to handle the scalar/vector/hash decision, and you can just return the sentinel from EvaluateAsync.
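To make the requested behaviour concrete, here is a tiny stand-alone C# model (names such as ExpectedValueType are mine and not part of the Inedo SDK): given the context the function was called in, an empty result resolves to an empty scalar, vector or map.
// Stand-alone model (not Inedo code) of the behaviour being requested.
using System;
using System.Collections.Generic;

enum ExpectedValueType { Scalar, Vector, Map }

class EmptyResultModel
{
    // Maps the calling context ($PSEval, @PSEval, %PSEval) onto the appropriate empty value.
    static object EmptyFor(ExpectedValueType type) => type switch
    {
        ExpectedValueType.Vector => new List<object>(),            // @PSEval(...) -> empty vector
        ExpectedValueType.Map => new Dictionary<string, object>(), // %PSEval(...) -> empty map
        _ => string.Empty                                          // $PSEval(...) -> empty string
    };

    static void Main()
    {
        Console.WriteLine(((List<object>)EmptyFor(ExpectedValueType.Vector)).Count);  // 0
        Console.WriteLine(((string)EmptyFor(ExpectedValueType.Scalar)).Length);       // 0
    }
}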
(Reposted from GitHub on request)
It does not appear to be possible to set the value of a list or map element, where the index or key is stored in a variable.
The formal grammar suggests...
<assign_variable_statement> ::= set [local | global] ( /variable_expression/ | /indexed_expression/ ) = /literal_expression/;
variable_expression:
A variable type identifier ($, @, or %) immediately followed by one of:
- simple name - follows same rules as any_name
- explicit name - a left curly brace ({), followed by characters with the same rules as any_name but that also allow spaces, followed by a right curly brace (})
indexed_expression:
A variable_expression for a vector (@) or map (%) type, immediately followed by one of:
- left bracket ([), scalar_expression, then right bracket (])
- dot (.) then scalar_expression
...which implies that both...
set %Map.$k = something;
set @Vec[$i] = something;
...should be possible, but these throw various errors at the execution engine level, e.g.:
Unhandled exception: System.InvalidCastException: Unable to cast object of type 'System.ArrayEnumerator' to type 'System.Collections.Generic.IEnumerator`1[Inedo.ExecutionEngine.RuntimeValue]'.
at Inedo.ExecutionEngine.RuntimeListValue.GetEnumerator()
at Inedo.ExecutionEngine.Mapping.CoreScriptPropertyMapper.CoerceValue(RuntimeValue value, PropertyInfo property, Type type)
at Inedo.Otter.Service.PlanExecuter.ExecutionVariableEvaluationContext.GetVariableFunctionInternal(RuntimeVariableName functionName, IList`1 arguments)
at Inedo.Otter.Service.PlanExecuter.ExecutionVariableEvaluationContext.TryEvaluateFunctionAsync(RuntimeVariableName functionName, IList`1 arguments)
at Inedo.ExecutionEngine.Variables.FunctionTextValue.EvaluateAsync(IVariableEvaluationContext context)
at Inedo.ExecutionEngine.Variables.ProcessedString.EvaluateValueAsync(IVariableEvaluationContext context)
at Inedo.ExecutionEngine.Executer.ExecuterThread.EvaluateAsync(EqualityPredicate equalityPredicate)
at Inedo.ExecutionEngine.Executer.ExecuterThread.ExecuteAsync(PredicateStatement predicateStatement)
at Inedo.ExecutionEngine.Executer.ExecuterThread.ExecuteNextAsync()
In the absence of syntax support, the sanest workaround is probably a pair of variable functions -- e.g. @ListSet(@Vec, $i, value) and %MapSet(%Map, $k, value) -- which could perform this operation.
My current workaround is to...
set %Map = %MapAdd(%MapRemove(%Map, $key), $key, value);
set @copy = @();
foreach $j in @Range(0, $ListCount(@Vec)) {
if ($i == $j) { set @copy = @ListInsert(@copy, value); }
else { set @copy = @ListInsert(@copy, $($ListItem(@Vec, $j))); }
}
set @Vec = @copy;
...the latter of which is so tortuous I've probably got it wrong just typing it in here (and does not handle anything other than lists of scalars).
Looking at the existing similar %MapAdd and @ListInsert functions, I expect the meat of these is probably fairly straightforward:
protected override IEnumerable EvaluateVector(IVariableFunctionContext context)
{
var list = this.List.ToList();
var index = this.Index;
// bounds checking
if (index >= list.Count)
{
// allow for growing the list to fit new index
list.AddRange(Enumerable.Range(0, 1+index-list.Count).Select(_ => string.Empty));
}
else if (index < 0)
{
// allow for negative indexing from end of array (but not growth)
if (-index >= list.Count) throw new ArgumentOutOfRangeException(nameof(this.Index));
index = list.Count + (index % list.Count);
}
list[index] = this.Value;
return list;
}
public override RuntimeValue Evaluate(IVariableFunctionContext context)
{
if (String.IsNullOrEmpty(this.Key)) throw new ArgumentNullException(nameof(this.Key));
var map = new Dictionary<string, RuntimeValue>(this.Map);
map[this.Key] = this.Value;
return new RuntimeValue(map);
}
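As a quick sanity check of the grow/negative-index handling sketched above, here is a stand-alone version of the same logic against plain .NET collections (the names are mine; nothing here is Inedo API):
// Stand-alone check (not Inedo code) of the indexing semantics for a hypothetical
// @ListSet: setting past the end grows the list, and a negative index counts back
// from the end without growing it.
using System;
using System.Collections.Generic;
using System.Linq;

class ListSetSemantics
{
    static List<string> ListSet(List<string> source, int index, string value)
    {
        var list = source.ToList();
        if (index >= list.Count)
            list.AddRange(Enumerable.Range(0, 1 + index - list.Count).Select(_ => string.Empty));
        else if (index < 0)
        {
            if (-index >= list.Count) throw new ArgumentOutOfRangeException(nameof(index));
            index = list.Count + (index % list.Count);
        }
        list[index] = value;
        return list;
    }

    static void Main()
    {
        Console.WriteLine(string.Join("|", ListSet(new List<string> { "a", "b" }, 3, "x")));        // a|b||x
        Console.WriteLine(string.Join("|", ListSet(new List<string> { "a", "b", "c" }, -1, "x"))); // a|b|x
    }
}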
(Reposted from GitHub on request)
I've been trying to understand how OSCall is supposed to work, so I can reuse some common OtterScript snippets across jobs.
However, after successfully calling a script with an input variable, I am trying to log its output variable, and am receiving the exception "An unhandled error occurred during execution phase: Log scope Execution has already been completed.. See the error logs for more details."
I can find no more details.
Using a minimal example, the common/inner/child script I am trying to OSCall is:
set $Result = Hello $Username;
Log-Information in child script: $Result;
The calling/outer/parent script is:
set $Result = "<not set>";
OSCall(
Name: Spikes/Hello.otter,
Variables: %(Username: fred),
OutputVariables: @(Result)
);
Log-Information in parent script: $Result;
(I don't know if it is required to initialize $Result in the outer scope or whether it would be set automatically in the parent; nor whether this was even the correct usage of the OutputVariables parameter -- these were the things I was testing. The documentation on OSCall is extremely light on detail.)
The resulting execution log is:
DEBUG: Job will be run against servers sequentially (not asynchronously).
DEBUG: No servers, server roles, or environments specified, and thus no servers will be targeted.
DEBUG: Beginning execution run...
DEBUG: Beginning execution run...
INFO: in child script: Hello fred
INFO: Execution run succeeded.
ERROR: An unhandled error occurred during execution phase: Log scope Execution has already been completed.. See the error logs for more details.
I can see from the log that it at least enters the child script and accepts the input variable (as evidenced by the in child script message), but it does not complete the subsequent log statement in the parent scope after the OSCall.
If I comment out either the OSCall or the Log-Information in the parent script, the error does not occur.
If I try the slightly more complicated parent script...
for server localhost {
set $Result = "<not set>";
Log-Information before oscall: $Result;
Create-File
(
Name: "C:\ProgramData\InedoOutput.1.txt",
Text: "in parent script: $Result",
Overwrite: true
);
OSCall(
Name: Spikes/Hello.otter,
Variables: %(Username: fred),
OutputVariables: @(Result)
);
Create-File
(
Name: "C:\ProgramData\InedoOutput.2.txt",
Text: "in parent script: $Result",
Overwrite: true
);
Log-Information in parent script: $Result;
}
...I don't get the Log Scope Execution... message, but I do get Execution run failed, and only the first file (InedoOutput.1.txt) is created:
DEBUG: Job will be run against servers sequentially (not asynchronously).
DEBUG: No servers, server roles, or environments specified, and thus no servers will be targeted.
DEBUG: Beginning execution run...
INFO: before oscall: <not set>
INFO: Creating file...
DEBUG: Creating directories for C:\ProgramData\InedoOutput.1.txt...
DEBUG: Creating C:\ProgramData\InedoOutput.1.txt...
INFO: C:\ProgramData\InedoOutput.1.txt file created.
DEBUG: Beginning execution run...
INFO: in child script: Hello fred
DEBUG: Execution run succeeded.
DEBUG: Cleaning up temporary files on Local Server...
ERROR: Execution run failed.
DEBUG: Cleaning up temporary files on Local Server...
It seems nothing will run after an OSCall...?
OK, I've tried, but I'm getting nowhere.
I've added a very rudimentary implementation of DynamicListVariableType, based on your code. I've used Inedo.SDK.GetServers(true); rather than linking directly to Otter.Core, but I think that might be irrelevant.
Custom variable types just seem not to work at all.
Any time you add one to the Job (including trying to add the built-in Universal Packages variable type), the front-end just throws an HTTP/500.
My variable type appears in the list; I select it, give it a variable name, and set the list properties (restrict, multi-select, etc.); when I click Save Variable, a dialog-style iframe pops up with the standard error page. The new variable's JSON is never written to the raft file.
An error occurred in the web application: Value cannot be null. (Parameter 'type')
URL: http://172.31.15.125:8626/jobs/templates/edit-variable?templateId=Default%3A%3AJobTemplate%3A%3AInitialization%2FRun Sysprep for target servers
Referrer: http://172.31.15.125:8626/jobs/templates/edit-variable?templateId=Default%3A%3AJobTemplate%3A%3AInitialization%2FRun Sysprep for target servers
User: Admin
User Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:133.0) Gecko/20100101 Firefox/133.0
Stack trace: at System.ArgumentNullException.Throw(String paramName)
at System.Activator.CreateInstance(Type type, Boolean nonPublic, Boolean wrapExceptions)
at Inedo.Otter.WebApplication.Pages.Jobs.JobTemplates.EditJobTemplateVariablePage.<>c__DisplayClass5_0.<CreateChildControls>b__1()
at Inedo.Web.Controls.ButtonLinks.PostBackButtonLink.Inedo.Web.Controls.ISimpleEventProcessor.ProcessEventAsync(String eventName, String eventArgument)
at Inedo.Web.PageFree.SimplePageBase.ExecutePageLifeCycleAsync()
at Inedo.Web.PageFree.SimplePageBase.ProcessRequestAsync(AhHttpContext context)
at Inedo.Web.AhWebMiddleware.InvokeAsync(HttpContext context)
::HTTP Error on 20/01/2025 18:46:39::
Attaching a debugger traces the error to code which decompiles horribly (and very likely does not match the codebase exactly). I won't paste it all here, but inside the EditJobTemplateVariablePage class there is a CreateChildControls method, partially containing:
PostBackButtonLink postBackButtonLink = new PostBackButtonLink("Save Variable", delegate {
// ... omitted for brevity
string selectedValue = CS$<>8__locals1.ddlType.SelectedValue;
VariableTemplateType variableTemplateType;
if (!(selectedValue == "Constant"))
{
if (!(selectedValue == "Text"))
{
if (!(selectedValue == "List"))
{
if (!(selectedValue == "Checkbox"))
{
// *** NEXT LINE THROWS... (parameter 'type' cannot be null) ***
variableTemplateType = (VariableTemplateType)Activator.CreateInstance(Type.GetType(CS$<>8__locals1.ddlType.SelectedValue));
}
else
{
variableTemplateType = VariableTemplateType.Checkbox;
}
}
else
{
variableTemplateType = VariableTemplateType.List;
}
}
else
{
variableTemplateType = VariableTemplateType.Text;
}
}
else
{
variableTemplateType = VariableTemplateType.Constant;
}
// ... omitted for brevity
});
It looks like the Type.GetType call can't resolve the custom VariableTemplateType class name.
The value of ddlType.SelectedValue looks (at first glance) to match the correct class and assembly name of my type (ServerSelector.ServerListVariableType, ServerSelector).
I've also tried to manually enter the variable definition directly into the raw raft content, but I must be missing something in the syntax, because the result just turns red on the /jobs page (no errors are logged in the Diagnostics Centre).
"JobVariables": [
{
"Name": "TargetServers",
"Description": "Execute against these servers",
"InitialValue": "",
"Type": "ServerSelector.ServerListVariableType, ServerSelector",
"Usage": "Input",
"ListValues": [],
"ListMultiple": true,
"ListRestrict": true
},
...
For what it is worth, my custom class is...
[DisplayName("Specific servers")]
public class ServerListVariableType : DynamicListVariableType
{
[Persistent]
[DisplayName("Include inactive")]
public bool IncludeInactive { get; set; } = false;
public override async Task<IEnumerable<string>> EnumerateListValuesAsync(VariableTemplateContext context)
{
IEnumerable<string> GetServerNames()
{
foreach (var s in Inedo.SDK.GetServers(IncludeInactive).Select(s => s.Name))
yield return s;
}
return await Task.FromResult(GetServerNames());
}
public override RichDescription GetDescription()
{
return new RichDescription("Allows selection of ",
new Hilite("Servers"),
" outside of Job targeting");
}
}
Thanks @atripp -- I can't believe I missed VariableTemplateType in the SDK!
Thanks for posting the BuildMaster equivalent -- I'll look to see if I can adapt it. As you say, the lack of DB may make this more challenging.
Out of interest, how resilient are extensions to mismatched versions of InedoSdk and/or Otter.Core? Would I have to recompile/redistribute an extension each time Otter is updated?
Regarding the security issue, I'm not entirely certain I agree -- perhaps, as the sole administrator/runner, my use case is vastly different from others. Agree to disagree.
Maybe, in a multi-user scenario, a valid middle ground might be to allow Job authors to limit the servers that can be for server'd into, i.e. as a property of the job, rather than cutting off all access to for server.
Or maybe allow defining a for server privilege on servers/roles/environments themselves, so that certain users are allowed to issue that command in job scripts, with the permissions resolved at runtime.
(I'm sure you've considered this before.)
Is there any practical reason why an OtterScript script cannot issue a for server call in a job that does not use Custom targeting? The limitation seems arbitrary at first glance.
Say, for example, that as part of my orchestration I want to run a job against a list of servers or an environment, and for each of those resolved servers I want to run an isolated step on a common ancillary server -- I simply can't. My only options are the unappealing ones listed after the example below.
For example, I might want to run the following contrived script against a subset of my machines...
set $hostname = $EnvironmentVariable(COMPUTERNAME);
set $macaddr = $PSEval((Get-NetRoute -DestinationPrefix '0.0.0.0/0' | Sort { $_.RouteMetric + $_.ifMetric } | Select -First 1 | Get-NetAdapter).MacAddress);
for server 'common-dhcp' {
set $nextip = $SHEval(/opt/dhcp/dhcpctl get-next-ip);
Exec(
FileName: /opt/dhcp/dhcpctl
Arguments: set-reservation $macaddr $nextip
);
}
for server 'common-dc' {
Create-File(
Name: "X:\known-servers\${hostname}.txt",
Text: $macaddr
);
}
I could create an input variable of type Text, and manually type the server names into that prompt; then @Split and $Trim that value, and hope I got the names right -- not fun for more than a few servers.
I could create a script-specific role and require that I pre-populate it before running the script. However, I can easily see that leading to mistakes, as the management of the role is separate from the running of the script -- re-running from the History view would be fraught, for example.
I could create an input variable of type List, then compose a separate script with the sole purpose of performing some Rafts_GetRaftItems / (some horrid JSON manipulation) / Rafts_CreateOrUpdateRaftItem sequence to periodically update the list with selectable server names -- but this is borderline insanity.
Otter already has a decent server/role/environment selector -- I just lose access to it if my script needs to for server.
Assuming there really is a good reason why we can't change context in a targeted script, could we perhaps have a dedicated Input Variable type (e.g. Object List), which could be automatically populated with servers, roles or environments (with single- or multi-select), so we can emulate the existing, user-friendly targeting in for server scripts?
(I note there seems to be a specific input variable type for Universal Packages, so I assume the concept of custom controls exists. I'd look at maybe extending this myself, but I don't think custom variable types are an extensible point in the SDK.)
Having attached a debugger to the Otter.Service.exe process in my lab, I am fairly certain that InedoPullAgentServer.HandleDisconnected (see Inedo.Agents.Client.dll) is never actually fired. This method is the only one I can see that is responsible for removing active connections from the openConnections collection.
It is supposed to be fired by the PullServerConnection.Disconnected event (inherited from AgentConnection<T>) -- at least, when the connection is created by InedoAgentClientListener.CreateConnection, an event handler is bound which would call HandleDisconnected -- but I can't see anywhere that the base event is ever actually triggered.
To restore stability to my lab system, I have monkey-patched the InedoPullAgentServer.ConnectionEstablishedAsync method so that the new connection always overwrites any existing one in the collection. I have tried to clean up the existing connection, but asynchronous C# is not my strongest suit, and I can't see through the multiple layers of indirection to determine whether there are any major reasons why Otter shouldn't do this.
The patched method is below, for your review; feel free to use it if it suits your needs:
internal async Task ConnectionEstablishedAsync(PullAgentHostIdentifier hostIdentifier, PullServerConnection connection, CancellationToken cancellationToken)
{
await this.ValidateConnectionAsync(hostIdentifier, connection, cancellationToken).ConfigureAwait(false);
Dictionary<PullAgentHostIdentifier, PullServerConnection> dictionary = this.openConnections;
lock (dictionary)
{
PullServerConnection existingConnection;
if (this.openConnections.TryGetValue(hostIdentifier, out existingConnection) && existingConnection != null)
{
using (existingConnection)
{
try
{
// HACK: provides an exception with valid stack trace to HandleDroppedConnection below
// ASSUMPTION: HandleDroppedConnection records the exception somewhere, e.g. the Diagnostics Centre, so
// needs to be filled with useful info; if not, we can avoid the try/catch
// TODO: is there any more suitable exception here...?
throw new Exception("An existing pull agent connection was abandoned");
}
catch (Exception ex)
{
// NOTE: HandleDisconnected also locks on (dictionary = openConnections), so this has to be
// done within the current thread to avoid deadlocking
// TODO: are there any other side-effects...?
// update the back-end database AgentConnections and clean up the in-memory collection
existingConnection.HandleDroppedConnection(ex);
this.HandleDisconnected(existingConnection);
}
}
}
// ...regardless, we always overwrite any existing entry in the in-memory collection
// (dictionary's indexed setter using InsertionBehavior.Overwrite)
this.openConnections[hostIdentifier] = connection;
}
}
(Note that the above was derived from a decompilation tool, so may not exactly match your existing naming, etc.)
I have an Otter server and a managed device configured using the Listen for Inedo Agent (i.e. a Pull agent).
When the agent is first installed and the server object is first added to Otter, everything works as it should:
16/01/2025 04:38:17: Starting agent connector for otter.lab.local:46336...
16/01/2025 04:38:17 DEBUG: Attempting to establish connection with otter.lab.local:46336...
16/01/2025 04:38:17 DEBUG: Connection established with otter.lab.local:46336.
However, if I reboot the machine that is running the Agent, it never reliably recovers:
16/01/2025 04:50:20: Starting agent connector for otter.lab.local:46336...
16/01/2025 04:50:20 DEBUG: Attempting to establish connection with otter.lab.local:46336...
16/01/2025 04:50:20 DEBUG: Connection established with otter.lab.local:46336.
16/01/2025 04:50:20: Connection to otter.lab.local:46336 dropped.
16/01/2025 04:50:20 DEBUG: Attempting to establish connection with otter.lab.local:46336...
16/01/2025 04:50:20 DEBUG: Connection established with otter.lab.local:46336.
16/01/2025 04:50:20: Connection to otter.lab.local:46336 dropped.
16/01/2025 04:50:20 DEBUG: Attempting to establish connection with otter.lab.local:46336...
16/01/2025 04:50:20 DEBUG: Connection established with otter.lab.local:46336.
16/01/2025 04:50:20: Connection to otter.lab:46336 dropped.
...
If I look at the Agent Listener Dashboard I can see thousands of connections (a SELECT COUNT(1) FROM AgentConnections; is now up to 33,058 in approximately 15 minutes), and the Diagnostics Center shows numerous messages:
Error sending handshake response to 172.31.15.123:51530: System.ArgumentException: An item with the same key has already been added. Key: Inedo.Agents.PullAgentHostIdentifier
at System.Collections.Generic.Dictionary`2.TryInsert(TKey key, TValue value, InsertionBehavior behavior)
at Inedo.Agents.InedoPullAgentServer.ConnectionEstablishedAsync(PullAgentHostIdentifier hostIdentifier, PullServerConnection connection, CancellationToken cancellationToken)
at Inedo.Agents.AgentListener`1.ProcessIncomingConnection(TConnection channel)
If I restart the Otter Server service (optionally issuing TRUNCATE TABLE AgentConnections; beforehand), everything calms down again -- at least until the machine running the Agent reboots again (or the Agent service restarts).
Looking at a decompilation of the .NET code, the Server service appears to be trying to add to a private, in-memory collection of open connections (Inedo.Agents.InedoPullAgentServer.openConnections), ultimately indexed by what I think is the Agent's secret key (Inedo.Agents.PullAgentHostIdentifier.UniqueKey). There is some logic that tries to remove from this collection when it detects a disconnection, but I don't know if that logic is actually firing, or if it is subtly incorrect.
It seems to me that, when a connection is re-established after an agent is restarted, it would still always send the same secret key, so this add operation can never succeed.
I am not familiar enough with the logic of why the Server service needs to do this; I can only see that collection being manipulated, never actually queried (but I might not have a complete decompilation).
If there is a tangible reason why the Server service needs to maintain this list, should entries instead be added with openConnections[key] = value, which would overwrite any existing entry for that key (as opposed to .Add(key, value), which throws if the key exists); or should PullAgentHostIdentifier combine and compare more information (e.g. source IP and TCP port) to increase its uniqueness?
(I only have the one agent machine defined in my Otter lab at the moment, so it shouldn't be the case that two agents might exist with the same secret key value.)
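For reference, the difference between the two dictionary operations in question, shown with plain .NET types (nothing Inedo-specific):
// Stand-alone illustration of Dictionary.Add vs the indexer setter, which is the
// crux of the ArgumentException in the log above.
using System;
using System.Collections.Generic;

class OverwriteDemo
{
    static void Main()
    {
        var openConnections = new Dictionary<string, string>();

        openConnections.Add("agent-secret-key", "connection-1");
        // openConnections.Add("agent-secret-key", "connection-2");
        //   -> ArgumentException: "An item with the same key has already been added."

        openConnections["agent-secret-key"] = "connection-2"; // the indexer setter overwrites silently
        Console.WriteLine(openConnections["agent-secret-key"]); // connection-2
    }
}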
Also, during my diagnosis, I can see that there is the intention of a (potentially configurable) 30-second delay between reconnection attempts (Inedo.Agents.AgentConnectionConfig.ReconnectDelay), but I can't see it being used anywhere (again, this may be an incomplete decompilation).
I was seeing reconnections as much as 1,000 times per second, so there clearly isn't a delay enforced anywhere else.
As an aside, I also note that the /administration/agent-listener page in the web application does not let me delete stale connections -- a per-row button exists, but throws a JavaScript error.
(Not that I want to manually click each connection when I have thousands of them to clean up -- a bulk-delete button would be a useful feature here! Some paging on that screen would be a bonus, too.)
Is issuing DELETE FROM AgentConnections WHERE ConnectionStatus_Code = 'D'; sufficient for me to clean these up from the view?
The /executions/execution-details page does not appear to completely encode all user-generated content.
Consider the following OtterScript, which demonstrates a classic script-injection:
Log-Information "<script type='text/javascript'>window.alert('oops!');</script>";
Run that script, then view the /executions/execution-details page for that execution. I think this is specific to content that is lazy-loaded into the page (the content of the Execution section only loads when the section is expanded). If Log-Error is used instead of Log-Information, the issue does not seem to occur; I think this is because the Execution section is already expanded when there are warning/error messages.
Subsection headers emitted by described (commented) nested blocks don't seem to be affected either, so...
# <script type='text/javascript'>window.alert('oops!');</script>
{
Log-Information "This should be logged inside a subsection; the subsection header encoding seems fine";
}
...seems not to be an issue, even though the Execution section is not initially expanded.
However, unexpanded messages inside the subsection are still subject to the issue...
# <script type='text/javascript'>window.alert('in comment header');</script>
{
Log-Information "<script type='text/javascript'>window.alert('in info');</script>";
}
For what it is worth, I originally noticed this because I was trying to Log-Information some generated XML content prior to writing it to a file. The tags of the XML intermingled with the HTML content and broke the rendering.
The <script> example is easier to demonstrate (and may itself present a wider XSS issue...).
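For reference, this stand-alone snippet shows the kind of encoding that defuses the payload when a log message is written into the page (plain .NET; I don't know which encoding helper the Otter web application actually uses):
// Stand-alone illustration: HTML-encoding the logged text renders the payload inert.
using System;
using System.Net;

class EncodeDemo
{
    static void Main()
    {
        var message = "<script type='text/javascript'>window.alert('oops!');</script>";
        Console.WriteLine(WebUtility.HtmlEncode(message));
        // &lt;script type=&#39;text/javascript&#39;&gt;window.alert(&#39;oops!&#39;);&lt;/script&gt;
    }
}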
Can confirm 2024.2-rc.1 fixes this specific HTTP/500 error. Thanks for looking.
(@atripp - as soon as I manually changed the feed URL in InedoHub to the one you gave, it offered to update itself from 1.3.12 to 1.4.4, so I did that first. Ironically, 1.3.12 is the version in the documentation page's screenshot, so I didn't consider it might be out of date.)
@dean-houston This might be a really silly question, but what is the URL for the Pre-Release package source feed?
The linked documentation suggests that Package source should be a drop-down list in InedoHub (from which I should select Inedo's Prerelease Feed); mine is a free-text field containing https://proget.inedo.com/upack/Products.
I have an Otter job template for which I have defined a number of custom variables, some of which are marked as Required and some are not.
If I view previous executions of that job in /jobs/history, there is a small (play) button icon against each execution; I believe the intention is to re-run any one particular execution with the variable prompts pre-populated with the previously-entered values for that execution.
If no custom variables are defined on the template, or if all the custom variables were populated with a value for that execution, clicking that button works as intended.
If, however, I did not set a value for one of the properties (i.e. it was left as an empty text box), clicking that button results in an HTTP/500 server error. The Diagnostic Centre page yields the following:
An error occurred in the web application: The given key 'MyCustomVariableName' was not present in the dictionary.
URL: http://xxxx:8626/jobs/from-template?jobTemplateId=Default%3A%3AJobTemplate%3A%3AMyTemplateFolder%2FMyTemplateName&jobId=140
Referrer: http://xxxx:8626/jobs/history
User: Admin
User Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:132.0) Gecko/20100101 Firefox/132.0
Stack trace: at System.Collections.Generic.Dictionary`2.get_Item(TKey key)
at Inedo.Otter.WebApplication.Pages.Jobs.CreateJobFromTemplatePage.VariablePrompts..ctor(IEnumerable`1 variables, Dictionary`2 selectedValues, VariableTemplateContext context)
at Inedo.Otter.WebApplication.Pages.Jobs.CreateJobFromTemplatePage.CreateChildControls()
at Inedo.Otter.WebApplication.Pages.OtterSimplePageBase.InitializeAsync()
at Inedo.Web.PageFree.SimplePageBase.ExecutePageLifeCycleAsync()
at Inedo.Web.PageFree.SimplePageBase.ProcessRequestAsync(AhHttpContext context)
at Inedo.Web.AhWebMiddleware.InvokeAsync(HttpContext context)
::HTTP Error on 19/12/2024 18:55:42::
I'm fairly certain it boils down to a simple null-ref check in the CreateJobFromTemplatePage class (decompiled from Otter.WebApplication.dll):
public VariablePrompts(IEnumerable<TemplateVariable> variables, Dictionary<string, string> selectedValues, VariableTemplateContext context)
{
object item;
CreateJobFromTemplatePage.VariablePrompts variablePrompt = this;
foreach (TemplateVariable variable in variables)
{
if (this.variableValues.ContainsKey(variable.Name))
{
continue;
}
TemplateVariable templateVariable = variable;
if (selectedValues != null)
{
item = selectedValues[variable.Name]; // <-- here
}
else
{
item = null;
}
if (item == null)
{
item = variable.InitialValue;
}
templateVariable.InitialValue = (string)item;
VariableTemplateInput variableTemplateInput = variable.Type.CreateInput(variable, context);
variablePrompt.get_Controls().Add(variableTemplateInput);
this.variableValues.Add(variable.Name, variableTemplateInput);
}
}
My guess is that you are not serializing unset/empty values, or that they are not being deserialized correctly into the selectedValues dictionary by the caller. Whether or not that is a larger problem is unknown, but you can probably bandage it here by simply changing to...
if (selectedValues != null && selectedValues.ContainsKey(variable.Name))
{
item = selectedValues[variable.Name];
}
...without affecting the surrounding logic.
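An equivalent, slightly tidier shape for that guard, shown stand-alone with plain .NET types (illustrative only; the names are mine):
// Stand-alone equivalent of the guarded lookup above: TryGetValue avoids the double
// dictionary lookup and falls back to the initial value when no previously-entered
// value was saved for the variable.
using System.Collections.Generic;

static class SelectedValueLookup
{
    public static string Resolve(Dictionary<string, string> selectedValues, string name, string initialValue)
        => selectedValues != null && selectedValues.TryGetValue(name, out var selected) && selected != null
            ? selected
            : initialValue;
}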
I have enabled the listening connection on my Otter install, and am trying to get a second server that has the agent installed to dial home to the Otter server.
The firewall is open, a self-signed certificate has been created on the Otter server, and its thumbprint has been configured in the Otter server's listener config. A public export of the self-signed cert has been installed in the Trusted Roots store on the server with the agent, and the Windows dialog claims the certificate chain is therefore OK.
I have created a server object with the server type of "pull", and pasted the secret key from the object into a Connections/Server node in the InedoAgent.config file on the server. Connections/@Enabled="true" is also set in that file.
However, the server object in Otter is stuck in the Error state.
The Agent Listener Dashboard shows connections from the server with the agent, every 30s or so. The Diagnostics Centre shows errors in a matching timeframe, with:
Bad handshake from SERVERWITHAGENTIP:52768: System.Security.Authentication.AuthenticationException: Authentication failed, see inner exception. ---> System.ComponentModel.Win32Exception (0x8009030D): The credentials supplied to the package were not recognized
at System.Net.SSPIWrapper.AcquireCredentialsHandle(ISSPIInterface secModule, String package, CredentialUse intent, SCHANNEL_CRED* scc)
at System.Net.Security.SslStreamPal.AcquireCredentialsHandle(CredentialUse credUsage, SCHANNEL_CRED* secureCredential)
at System.Net.Security.SslStreamPal.AcquireCredentialsHandleSchannelCred(SslStreamCertificateContext certificateContext, SslProtocols protocols, EncryptionPolicy policy, Boolean isServer)
at System.Net.Security.SslStreamPal.AcquireCredentialsHandle(SslStreamCertificateContext certificateContext, SslProtocols protocols, EncryptionPolicy policy, Boolean isServer)
--- End of inner exception stack trace ---
at System.Net.Security.SslStreamPal.AcquireCredentialsHandle(SslStreamCertificateContext certificateContext, SslProtocols protocols, EncryptionPolicy policy, Boolean isServer)
at System.Net.Security.SecureChannel.AcquireServerCredentials(Byte[]& thumbPrint)
at System.Net.Security.SecureChannel.GenerateToken(ReadOnlySpan`1 inputBuffer, Byte[]& output)
at System.Net.Security.SecureChannel.NextMessage(ReadOnlySpan`1 incomingBuffer)
at System.Net.Security.SslStream.ProcessBlob(Int32 frameSize)
at System.Net.Security.SslStream.ReceiveBlobAsync[TIOAdapter](TIOAdapter adapter)
at System.Net.Security.SslStream.ForceAuthenticationAsync[TIOAdapter](TIOAdapter adapter, Boolean receiveFirst, Byte[] reAuthenticationData, Boolean isApm)
at Inedo.Agents.Connections.PullServerConnection.ReceiveHandshakeAsync(CancellationToken cancellationToken)
at Inedo.Agents.AgentListener`1.ProcessIncomingConnection(TConnection channel)
If I set LogFile in the InedoAgent.config file, I see repeated entries for:
07/06/2023 06:13:44 DEBUG: Attempting to establish connection with OTTERSERVER:46336...
07/06/2023 06:14:14 DEBUG: Attempting to establish connection with OTTERSERVER:46336...
DNS and firewall look fine:
> Test-NetConnection OTTERSERVER -Port 46336
ComputerName : OTTERSERVER
RemoteAddress : OTTERSERVERIP
RemotePort : 46336
InterfaceAlias : Ethernet0
SourceAddress : SERVERWITHAGENTIP
TcpTestSucceeded : True
I can add the standard .NET trace listeners to the InedoAgentService.exe.config, but I'm not sure what I'm looking for in the massive infodump the resulting trace file then contains.
What am I missing?
I am trying to determine if an Otter server object has been created, and am using the recommended (non-native) API to do this.
At the moment, in order to perform this test (at least as per the documentation), I have to issue an /api/infrastructure/servers/list call and loop through all the results, matching on the Name property.
As this project grows, this is going to amount to a whole heap of JSON parsing, and the list is only going to get longer and longer. That's going to have a growing impact on the memory and processing time of my scripts, not to mention the log storage (as each API call response appears to be logged).
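What I am doing today looks roughly like the following stand-alone sketch (the X-ApiKey header, the array response shape and the lower-case name JSON property are my assumptions, not statements about the API):
// Sketch of the current workaround: fetch the full list from
// /api/infrastructure/servers/list and filter client-side by name.
using System;
using System.Linq;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

class ServerLookup
{
    static async Task<bool> ServerExistsAsync(string baseUrl, string apiKey, string serverName)
    {
        using var http = new HttpClient { BaseAddress = new Uri(baseUrl) };
        http.DefaultRequestHeaders.Add("X-ApiKey", apiKey); // assumed header name

        var json = await http.GetStringAsync("/api/infrastructure/servers/list");
        using var doc = JsonDocument.Parse(json);

        // Assumes the response is a JSON array of server objects with a "name" property.
        return doc.RootElement.EnumerateArray()
            .Any(s => string.Equals(s.GetProperty("name").GetString(), serverName, StringComparison.OrdinalIgnoreCase));
    }

    static async Task Main() =>
        Console.WriteLine(await ServerExistsAsync("http://otter.example.local:8626", "API-KEY-HERE", "SERVERNAME"));
}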
Instead, I would like to call an API method which returns the given object by name (I believe name is unique within Otter).
I've already tried to see if an undocumented RESTful or RPC-like method already exists, but both...
/api/infrastructure/servers/SERVERNAME
/api/infrastructure/servers/get/SERVERNAME
...return HTTP/400 Invalid action type (I have also tried the RPC-like methods search and find, with the same result). I had hoped that one of them might return HTTP/200 and the JSON of the single found object, or HTTP/404 if it did not exist.
Can I please raise the request that something along these lines be considered for the recommended API?
Or if it already exists, could the documentation be updated to include it?
@atripp said in Apply-Template adding unexpected CR newline chars:
In any case, we should probably switch to an enumeration like the template operation has (TemplateNewLineMode) -- but with a fourth option (None) added to both.
Creating a None value may be risky. It's just that None is such a "default-sounding" word that someone might one day try to make it the default instead of Auto, and everyone would have to update scripts that worked before.
For Create-File, I picked a bool to indicate that you can either attempt to transform the newlines in the file or not, and named it so that the default (i.e. false) keeps the current behaviour. That way, it becomes an opt-in parameter: if anyone is relying on the way Create-File currently handles newlines, they shouldn't have to change their scripts to retain the old behaviour.
If you standardise the logic so that both Apply-Template and Create-File support a NewLines property in the same way, having the default value remain Auto would still mean that the current behaviour is retained without changing the scripts.
Perhaps Binary would indicate both a non-default status and that the source isn't transformed...
(Of course, the term Auto might itself lead someone to expect that binary/non-text source files would not be subject to line-ending alteration -- I don't know how you get around that. Naming things is hard.)
@atripp
Thanks for looking.
Without wanting to hold you to a particular date, what is the typical release timeframe?
I just need to work out whether to work around this issue for now, or wait it out until the next release...
It didn't seem right to me either, so I assumed I was missing something obvious.
Am I right in saying that for server is only expected to work with Custom server targeting, though?
And for server can take a scalar variable argument, such as...
for server localhost {
Log-Debug "some arbitrary stuff on the Otter server"
set $serverIWantToConfigure = "TheServerIChoseInMyProperty";
for server $serverIWantToConfigure {
Log-Debug "some arbitrary stuff on the server I want to configure"
}
}
...?
Here's the Create-File code, where there seems to be another replacement:
https://github.com/Inedo/inedox-inedocore/blob/master/InedoCore/InedoExtension/Operations/Files/CreateFileOperation.cs#L77
That appears to use the same Regex (or close enough) as I have proposed for Apply-Template above, but it always replaces with fileOps.NewLine, which I assume is server-specific.
If someone genuinely needed to create a file which used a different line-ending than the platform's default, that logic falls short.
Perhaps, instead, you could ratify an additional property which controls this:
+++
[ScriptAlias("RawMode")]
[Description("If true, does not attempt to rewrite newlines")]
public bool RawMode { get; set; }
...then alter the line to gate that newline rewrite behind that property:
---
var text = Regex.Replace(this.Text ?? string.Empty, @"\r?\n", fileOps.NewLine);
+++
var text = this.Text ?? string.Empty;
// OPT: reuse the interned regex from Apply-Template above?
if (!this.RawMode) text = Regex.Replace(text, @"\r?\n", fileOps.NewLine);
You are then doing a further translation further down at #L92, where you use StreamWriter to explicitly set the newline for Linux operations.
If fileOps.NewLine is already abstracting this, it may not actually be necessary; in any case, I don't think it does what you think it will, because you only call writer.WriteAsync(text) and not writer.WriteLineAsync(text).
I think the StreamWriter.NewLine property only applies when WriteLine/WriteLineAsync are called, and only to the terminator appended at the end of each call -- it doesn't replace existing newlines in the supplied string.
I suspect, therefore, that the line at #L92 can become:
using var stream = await linuxFileOps.OpenFileAsync(path, FileMode.Create, FileAccess.Write, Extensions.PosixFileMode.FromDecimal(mode.Value).OctalValue);
--- using var writer = new StreamWriter(stream, InedoLib.UTF8Encoding) { NewLine = linuxFileOps.NewLine };
+++ using var writer = new StreamWriter(stream, InedoLib.UTF8Encoding);
await writer.WriteAsync(text);
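A quick stand-alone check of the TextWriter.NewLine behaviour described above (StringWriter shares TextWriter's NewLine handling):
// Stand-alone check: TextWriter.NewLine only affects what WriteLine appends; it does
// not rewrite newlines already present in the text passed to Write/WriteAsync.
using System;
using System.IO;
using System.Linq;

class NewLineDemo
{
    static void Main()
    {
        using var writer = new StringWriter { NewLine = "\n" };
        writer.Write("a\r\nb");   // the embedded \r\n passes through untouched
        writer.WriteLine();        // only here is the configured "\n" emitted

        Console.WriteLine(string.Join(",", writer.ToString().Select(c => (int)c)));
        // 97,13,10,98,10  ->  'a', \r, \n, 'b', \n
    }
}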
I wonder if this is the issue in Apply-Template?
https://github.com/Inedo/inedox-inedocore/blob/master/InedoCore/InedoExtension/Operations/General/ApplyTemplateOperation.cs#L90
At first glance, it would certainly appear to be that line which creates the \r characters I am seeing.
I'm guessing the original string must have \r\n in it or something.
I assume that >>swim strings>> which contain newlines persist them as \r\n, but that might be platform-specific. If I can therefore also assume that the only supported platforms are Windows and Linux (so \r\n or \n), then it might be enough to replace that line with:
---
if (this.NewLineMode == TemplateNewLineMode.Windows)
result = result.Replace("\n", "\r\n");
+++
string targetNewline;
switch (this.NewLineMode)
{
case TemplateNewLineMode.Windows:
targetNewline = "\r\n";
break;
case TemplateNewLineMode.Linux:
targetNewline = "\n";
break;
case TemplateNewLineMode.Auto:
auto: // jump here after warning, if not handled
targetNewline = Environment.NewLine; // should this be fileOps.NewLine...?
break;
default:
this.LogWarning($"unsupported NewLine value '{this.NewLineMode}'; Auto assumed");
goto auto; // jump to Auto case
}
// TODO: intern this in a singleton utility class?
var newlineSearcher = new Regex(@"
(?> # atomic capture and discard matched
\r? # optional \r char
\n) # definitive \n char
", RegexOptions.Multiline
| RegexOptions.Compiled
| RegexOptions.IgnorePatternWhitespace);
result = newlineSearcher.Replace(result, targetNewline);
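A quick stand-alone sanity check of the normalisation above, confirming that the (?>\r?\n) pattern maps both \r\n and bare \n onto the supplied target newline:
// Stand-alone sanity check of the replacement proposed above.
using System;
using System.Text.RegularExpressions;

class NewlineNormalisationCheck
{
    static void Main()
    {
        var mixed = "a\r\nb\nc";
        Console.WriteLine(Regex.Replace(mixed, @"(?>\r?\n)", "\r\n") == "a\r\nb\r\nc"); // True
        Console.WriteLine(Regex.Replace(mixed, @"(?>\r?\n)", "\n") == "a\nb\nc");       // True
    }
}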
There is another replacement happening at https://github.com/Inedo/inedox-inedocore/blob/master/InedoCore/InedoExtension/Operations/General/ApplyTemplateOperation.cs#L97 which I don't think would be needed, if the above is applied...
---
var fileOps = await context.Agent.GetServiceAsync<IFileOperationsExecuter>().ConfigureAwait(false);
if (this.NewLineMode == TemplateNewLineMode.Auto)
result = result.Replace("\n", fileOps.NewLine);
...but it may be appropriate to hoist the instantiation of fileOps and use fileOps.NewLine in place of Environment.NewLine in the switch above (I assume that fileOps.NewLine reflects the platform of the current for server target, rather than the platform on which Otter itself is running).
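To make the hoisting suggestion concrete, here is a rough sketch of the shape I have in mind (the helper and the stand-in enum are my own names, not the extension's; in the real operation the autoNewline argument would be fileOps.NewLine):
using System;
using System.Text.RegularExpressions;

// stand-in for the extension's TemplateNewLineMode
enum NewLineMode { Auto, Windows, Linux }

static class TemplateNewlines
{
    private static readonly Regex NewlineSearcher = new Regex(@"(?>\r?\n)", RegexOptions.Compiled);

    // autoNewline is resolved once by the caller -- e.g. from fileOps.NewLine --
    // so Auto reflects the target server rather than the Otter host
    public static string Apply(string text, NewLineMode mode, string autoNewline)
    {
        var targetNewline = mode switch
        {
            NewLineMode.Windows => "\r\n",
            NewLineMode.Linux => "\n",
            _ => autoNewline // Auto, and anything unexpected, falls back to the hoisted value
        };
        return NewlineSearcher.Replace(text ?? string.Empty, targetNewline);
    }
}

class HoistDemo
{
    static void Main()
    {
        // "\n" stands in here for a Linux target's fileOps.NewLine
        Console.WriteLine(TemplateNewlines.Apply("a\r\nb\nc", NewLineMode.Auto, "\n") == "a\nb\nc"); // True
    }
}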
@atripp said in Basic arithmetic in OtterScript:
You're right, you would have to switch to localhost; however, you wouldn't have to "switch back"
But any time I try to switch anywhere with for server, I am met with:
Server context switching is not allowed on this plan execution.
I can only get for server to work if I run an ad-hoc job with Custom server targeting, but if I use ad-hoc jobs, I can't prompt for variables.
I can only prompt for variables with a job template, but the option for Custom server targeting is not available in job templates.
So switching is indeed a pain, regardless of direction.
I am experiencing an unexpected bug in the following minimum-repro OtterScript:
set $literal = >-@>
this is line 1
this is line 2
this is line 3
>-@>;
Apply-Template(
Literal: $literal,
OutputFile: $PathCombine($SpecialWindowsPath(CommonApplicationData), none.txt)
);
Apply-Template(
Literal: $literal,
NewLines: Auto,
OutputFile: $PathCombine($SpecialWindowsPath(CommonApplicationData), auto.txt)
);
Apply-Template(
Literal: $literal,
NewLines: Windows,
OutputFile: $PathCombine($SpecialWindowsPath(CommonApplicationData), windows.txt)
);
Apply-Template(
Literal: $literal,
NewLines: Linux,
OutputFile: $PathCombine($SpecialWindowsPath(CommonApplicationData), linux.txt)
);
When run against localhost (a Windows server), the resulting files appear to be created with additional \r chars in between each line.
I was expecting that, for NewLines: Windows, the byte-sequence for line-endings would be \r\n, and for NewLines: Linux, \n. Instead, I appear to get \r\r\n and \r\n respectively.
(I assume not specifying NewLines is the same as Auto, and matches the Windows behaviour because of the target server's platform. I have not tried against a Linux target server.)
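For anyone wanting to reproduce the check, this is the kind of throwaway byte-dump I mean (my own sketch; a hex viewer works just as well) -- it labels CR and LF bytes, which makes a \r\r\n run easy to spot:
using System;
using System.IO;
using System.Linq;

class LineEndingDump
{
    static void Main(string[] args)
    {
        // print every byte of the file, labelling CR (0x0D) and LF (0x0A)
        var bytes = File.ReadAllBytes(args[0]);
        Console.WriteLine(string.Join(" ",
            bytes.Select(b => b == 0x0D ? "CR" : b == 0x0A ? "LF" : b.ToString("X2"))));
    }
}
Against the windows.txt output above, the extra carriage returns show up as CR CR LF runs.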
Note that I am not referring to the extra lines at the top and bottom of the resulting files -- those are clearly because I wrote newlines before and after my fish-quotes -- nor to the spaces at the start of each line, which also exist in the fish-string.
The script was entered via the /scripts/edit2 text editor rather than /osve.
Have I misunderstood some nuance of the Apply-Template operation, or of fish-strings in general, or is this a bug?
PS: I thought I might work around this with...
Apply-Template
(
OutputVariable => $out,
Literal: $literal,
NewLines: Windows
);
set $linux_out = $RegexReplace($out, "[\r]+\n", "`n");
Log-Debug $linux_out;
set $win_out = $RegexReplace($out, "[\r]+\n", "`r`n");
Log-Debug $win_out;
Create-File
(
Name: $PathCombine($SpecialWindowsPath(CommonApplicationData), linux_pp.txt),
Text: $linux_out,
Overwrite: true
);
Create-File
(
Name: $PathCombine($SpecialWindowsPath(CommonApplicationData), windows_pp.txt),
Text: $win_out,
Overwrite: true
);
...but it appears that Create-File has its own nuance, and all of them were turned into \r\n newlines. I assume, therefore, that it is not possible to use Create-File to create a raw file containing the exact content of a variable, with no pre-processing?
Version 2022.10 (Build 1)
@atripp said in Basic arithmetic in OtterScript:
There isn't any noticeable overhead.
This assumes I am orchestrating a system which can run PowerShell.
As it stands, to perform the $PSEval, I would first have to for server to a utility server (e.g. localhost) to run the PowerShell, then for server back to the server I was actually orchestrating.
Or I would have to $SHEval a call to something which can do maths and which is guaranteed to be on the target device (possibly bc, if POSIX), and capture stdout to get the result of the calculation.
Rudimentary support for basic expressions would avoid that rather complex round-trip, hence the feature request.
@apxltd; apologies for the delay; I did not get a notification.
Essentially, it is a chicken-and-egg problem.
Enabling key-based, non-interactive login was one of the things I was hoping to automate/remediate with Otter; instead it is currently a prerequisite to using Otter.
The device is not a 'server' per se, but a third-party 'appliance' built on top of Linux. The device image itself comes prebuilt with keyboard-interactive auth (not password) enabled.
Having Otter support keyboard-interactive seemed more beneficial to a wider audience than trying to alter the appliance software.
Does OtterScript have any rudimentary arithmetic, beyond $Increment and $Decrement, or is it reserved only for specific statements?
I was trying what I thought was a basic array lookup:
set @parts = $Split($FullPath, "/");
set @basename = $ListItem(@parts, $ListCount(@parts) - 1);
...and was told: Cannot convert property "Index" value "3 - 1" to Int32 type.
I assume that means $ListCount(@parts) was resolved to 3, and 3 - 1 could not be used as the second parameter to $ListItem.
Subsequently...
set @basename = $ListItem(@parts, $Decrement($ListCount(@parts)));
...did seem to work, but what if my arithmetic operation was more complex?
Do I really need to invoke PowerShell to do basic maths? It seems like overkill, especially in the context of a remote server connection.
If that really is the case, could an $Expr(...) function be considered for the next version of the language, akin to Tcl's expr command? e.g. allowing for $Expr((5 + 3) * 4 / 3^0.5) or $Expr($ListCount(@path) - 1).
If it's not the case, and I've just missed something, could you advise the correct form?
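For what it's worth, I don't imagine this would need a full expression parser on your side; purely as an illustrative sketch (not Inedo code), even something as blunt as DataTable.Compute could back a basic $Expr, albeit without the ^ operator from my example:
using System;
using System.Data;

class ExprSketch
{
    static void Main()
    {
        // DataTable.Compute evaluates simple infix arithmetic: + - * / % and parentheses
        var evaluator = new DataTable();
        Console.WriteLine(evaluator.Compute("3 - 1", null));           // 2
        Console.WriteLine(evaluator.Compute("(5 + 3) * 4 / 3", null)); // evaluated numerically
    }
}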
Thanks for replying, and for indicating what you use internally. It is a feature I have had to implement myself in one of our own tools, so I have some familiarity with doing it (at least, in Go).
I believe the equivalent in libssh2 is libssh2_userauth_keyboard_interactive_ex, so I presume the code path would branch based on whether a Use keyboard-interactive method option was set.
That function takes a callback argument which, when invoked, receives pointers to an array of server prompts and a target array of responses. The callback is expected to populate the target array with the responses.
At its simplest, you can probably assume the password prompt is the first in the array, and set the first element of the target array to the configured password -- that should cover most use-cases.
So, not a simple flag, but it should be straightforward enough, depending on the libssh2 wrapper you are using (your signature suggests C#?).
If you did want to support more than one prompt, you could allow storing a list of responses against either the server or the credential object, and writing them to the target response array in the order they were stored.
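I don't know which SSH library Otter wraps (the Inedo.Agents.Ssh namespace in the exception suggests something in-house or wrapped), but for illustration, if it were something like SSH.NET, the whole thing reduces to roughly this:
using System;
using Renci.SshNet;

class KeyboardInteractiveExample
{
    static void Main()
    {
        var host = "appliance.example.com";   // placeholder values
        var username = "admin";
        var password = "secret";

        var kbd = new KeyboardInteractiveAuthenticationMethod(username);
        kbd.AuthenticationPrompt += (sender, e) =>
        {
            // simplest case: answer anything that looks like a password prompt
            foreach (var prompt in e.Prompts)
            {
                if (prompt.Request.IndexOf("password", StringComparison.OrdinalIgnoreCase) >= 0)
                    prompt.Response = password;
            }
        };

        using var client = new SshClient(new ConnectionInfo(host, username, kbd));
        client.Connect();
        Console.WriteLine(client.IsConnected);
        client.Disconnect();
    }
}
That per-prompt loop is also where a stored list of responses could be mapped, if multiple prompts ever needed supporting.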
I've been setting up some SSH-based servers in Otter, and have experienced authentication failures, even though I know the username and password to be correct:
Unhandled exception: Inedo.Agents.Ssh.SshException: Authentication failed (username/password)
It took me a while to work out what was going wrong.
In the SSH standard, there are two types of "password" authentication: "password", where the client sends the password directly in the authentication request, and "keyboard-interactive", where the server sends one or more prompts and the client answers each in turn.
It appears the SSH agent in Otter only supports true "password" authentication, but many users' expectation of a username/password-based login also falls under "keyboard-interactive". Indeed, many distributions disable true "password" authentication and enable "keyboard-interactive" by default.
Obviously, the correct way to work around this is to use public/private key-based authentication but, as a feature request, could keyboard-interactive authentication also be supported?
Note that I am not suggesting the end-user should have to manually enter the password on each job run, just that the Otter SSH client handles the case where the server's permitted authentication method happens to be keyboard-interactive.
Both "password" and "keyboard-interactive" ultimately resolve to the client sending a password string, so if your SSH client library supports keyboard-interactive authentication, can you expose that option?
At its simplest, I imagine this could be a tickbox option (e.g. Use keyboard-interactive method for passwords) when editing an SSH-based server, but I suppose you could also attempt to auto-detect, from the server response, when authentication has failed due to using the wrong method.
Keyboard-interactive also technically allows the server to send multiple prompts, each answered in turn (with the same or different strings), so the ability to define multiple response strings might be useful; at a minimum, though, supporting just the password over the keyboard-interactive layer should be enough for most.
I am trying to determine a way whereby new VMs are automatically made available as deployment targets in Otter. They do not have to be assigned to environments yet -- this can be done later -- it's just the onboarding that I want to start with.
For the purposes of this question, assume that there are a significant number of VMs to create. There is a template VM, which has the Inedo agent installed. VMs will be created by copying the template in (for example) the ESXi shell and registered using vim-cmd. I won't know the IPs of the created VMs until after they have first booted and DHCP has assigned them.
The InedoAgent.config is only going to have a single encryption key (the one in the template) and an outbound Connection defined to a central Otter server.
If I am reading the documentation correctly, in order for an outbound connection to be made by the Inedo Agent, the outbound connection must know both the hostname (of the Otter server) and a key. The key, however, must be generated on the Otter server and manually pasted into the InedoAgent.config file, and it must be unique for each individual agent instance.
It cannot be correct that I need to generate objects in Otter, just so Otter can generate a key for each of them, just so I can then manually provision the agent on each VM, so each VM can connect to Otter.
At any significant scale, I'd need an orchestration system to provision Otter, so I can use Otter as an orchestration system.
Although I can run rudimentary Python scripts from the ESXi shell, even if I concocted a script to call out to the Otter API and have it create the server object, provision the agent, and retrieve the key (I have not tried this yet), I still can't get that key into the InedoAgent.config file.
Is there a better way?