Iterating over groovy file contents - jenkins

I have two Groovy scripts (executed in a Jenkins pipeline), one with variables:
// Params file
PARAM1 = "1"
PARAM2 = "2"
PARAM3 = "3"
PARAM4 = "4"
return this
and the second one that loads this file and uses the params:
def load_params() {
    def parameters = load "params.groovy"
    final_parameter = ""
    for (parameter in parameters) {
        if (final_parameter == "") {
            final_parameter = parameter.key.toUpperCase() + "=" + parameter.value
        } else {
            final_parameter = final_parameter + "&|&" + parameter.key.toUpperCase() + "=" + parameter.value
        }
    }
    return final_parameter
}
return this
The issue is that iterating over parameters does not work: the loaded object is not a map, so I cannot access the entries that way.
I could use parameters.PARAM1 directly, but the first script is generated dynamically and the names change, so I need a way to do this without hard-coding the names.
Is there any way to convert or iterate the parameters so that I can get the keys and values?

Here are a few things you can do.
Option 01: Make the Parameters a Map
This is the best solution I guess.
// Params file
parameters = [ "PARAM1": "1", "PARAM2": "2", "PARAM3": "3", "PARAM4": "4"]
return this
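A possible way to consume this map (an assumption on my part, relying on the loaded script exposing its binding variables as properties) would be:
def load_params() {
    def script = load "params.groovy"
    // 'parameters' is resolved through the loaded script's binding
    return script.parameters
        .collect { key, value -> key.toUpperCase() + "=" + value }
        .join("&|&")
}
return this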
Option 02: Read the Parameters as a File
Instead of loading it as a script, you can read the params file as plain text, then iterate over it line by line and parse the parameters, as sketched below.
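A rough sketch of that approach (my own illustration, assuming readFile runs from the workspace directory containing params.groovy and that every assignment line has the form NAME = "value"):
def load_params() {
    def pairs = []
    readFile('params.groovy').readLines().each { line ->
        // skip comments and lines that are not assignments, e.g. 'return this'
        if (line.contains('=') && !line.trim().startsWith('//')) {
            def (key, value) = line.split('=', 2)*.trim()
            pairs << key.toUpperCase() + '=' + value.replaceAll('"', '')
        }
    }
    return pairs.join('&|&')
}
return this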
Option 03: Get all the Variables defined in the current binding.
This will return all the variables in the current binding, so you may have to filter it down to just the ones you defined in the params file.
def load_params() {
    def parameters = load "params.groovy"
    final_parameter = ""
    paramMap = [:] << parameters.getBinding().getVariables()
    for (parameter in paramMap) {
        if (final_parameter == "") {
            final_parameter = parameter.key.toUpperCase() + "=" + parameter.value
        } else {
            final_parameter = final_parameter + "&|&" + parameter.key.toUpperCase() + "=" + parameter.value
        }
    }
    return final_parameter
}
return this

Related

Gatling: How to pass jsonPath saved variable to another exec

I am new to Gatling and Scala, and I am facing an issue passing a variable saved via jsonPath from one repeat section to another forEach section.
The variable "dcIds" below does not get passed to the forEach section. Please also suggest how to improve the code.
var dcIdd = ""
val r = new scala.util.Random
def orderRef() = r.nextInt(100)
def getCreateRequest: String = {
  val data = s"""
    [{
      "name":"DC_${orderRef()}",
      "location":"Seattle, Washington, USA",
      "type":"Colocation"
    }]
  """.stripMargin
  data
}
def createAppRequest: String = {
  val data = s"""
    [{
      "name":"App_${orderRef()}",
      "owner":"a#a.com",
      "dataCenterId":"${dcIdd}",
      "strategy":"Rehost",
      "migrationStrategy":"Rehost"
    }]
  """.stripMargin
  data
}
val scn = scenario("Add DC")
  .repeat(DcIterations, "index") {
    exec(
      http("List_plans")
        .get(uri2 + "?plan_id=")
        .headers(headers_sec)
        .resources(
          http("DC add")
            .post(uri2)
            .headers(headers_sec)
            .body(StringBody(session => getCreateRequest))
            .check(jsonPath("$.ids[*]").findAll.saveAs("dcIds"))))
  }
  .foreach("${dcIds}", "dcId") {
    dcIdd = "${dcId}"
    repeat(AppIterations, "index") {
      exec(http("Add Application")
        .post(uri1 + "/applications/${dcId}")
        .headers(headers_sec)
        .body(StringBody(session => createAppRequest))
      )
    }
  }

How to write a map to a YAML file in Dart

I have a map of key-value pairs in Dart. I want to convert it to YAML and write it to a file.
I tried the yaml package from the Dart ecosystem, but it only provides methods to load YAML data from a file; nothing is mentioned about writing it back to a YAML file.
Here is an example:
void main() {
  var map = {
    "name": "abc",
    "type": "unknown",
    "internal": {
      "name": "xyz"
    }
  };
  print(map);
}
Expected output:
example.yaml
name: abc
type: unknown
internal:
  name: xyz
How to convert the dart map to YAML and write it to a file?
This is a bit of a late response, but for anyone else looking at this question, I have written the class below. It may not be perfect, but it works for what I'm doing and I haven't found anything wrong with it yet. I might make it a package eventually, after writing tests.
class YamlWriter {
  /// The amount of spaces for each level.
  final int spaces;

  /// Initialize the writer with the amount of [spaces] per level.
  YamlWriter({
    this.spaces = 2,
  });

  /// Write a dart structure to a YAML string. [yaml] should be a [Map] or [List].
  String write(dynamic yaml) {
    return _writeInternal(yaml).trim();
  }

  /// Write a dart structure to a YAML string. [yaml] should be a [Map] or [List].
  String _writeInternal(dynamic yaml, { int indent = 0 }) {
    String str = '';

    if (yaml is List) {
      str += _writeList(yaml, indent: indent);
    } else if (yaml is Map) {
      str += _writeMap(yaml, indent: indent);
    } else if (yaml is String) {
      str += "\"${yaml.replaceAll("\"", "\\\"")}\"";
    } else {
      str += yaml.toString();
    }

    return str;
  }

  /// Write a list to a YAML string.
  /// Pass the list in as [yaml] and indent it to the [indent] level.
  String _writeList(List yaml, { int indent = 0 }) {
    String str = '\n';

    for (var item in yaml) {
      str += "${_indent(indent)}- ${_writeInternal(item, indent: indent + 1)}\n";
    }

    return str;
  }

  /// Write a map to a YAML string.
  /// Pass the map in as [yaml] and indent it to the [indent] level.
  String _writeMap(Map yaml, { int indent = 0 }) {
    String str = '\n';

    for (var key in yaml.keys) {
      var value = yaml[key];
      str += "${_indent(indent)}${key.toString()}: ${_writeInternal(value, indent: indent + 1)}\n";
    }

    return str;
  }

  /// Create an indented string for the level with the spaces config.
  /// [indent] is the level of indent whereas [spaces] is the
  /// amount of spaces that the string should be indented by.
  String _indent(int indent) {
    return ''.padLeft(indent * spaces, ' ');
  }
}
Usage:
final writer = YamlWriter();

String yaml = writer.write({
  'string': 'Foo',
  'int': 1,
  'double': 3.14,
  'boolean': true,
  'list': [
    'Item One',
    'Item Two',
    true,
    'Item Four',
  ],
  'map': {
    'foo': 'bar',
    'list': ['Foo', 'Bar'],
  },
});

File file = File('/path/to/file.yaml');
file.createSync();
file.writeAsStringSync(yaml);
Output:
string: "Foo"
int: 1
double: 3.14
boolean: true
list:
- "Item One"
- "Item Two"
- true
- "Item Four"
map:
foo: "bar"
list:
- "Foo"
- "Bar"
package:yaml does not have YAML writing features. You may have to look for another package that does that – or write your own.
As a stopgap, remember that JSON is valid YAML, so you can always write out JSON to a .yaml file and it should work with any YAML parser.
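A minimal sketch of that stopgap (the file name and map are just examples):
import 'dart:convert';
import 'dart:io';

void main() {
  final map = {
    'name': 'abc',
    'type': 'unknown',
    'internal': {'name': 'xyz'},
  };
  // JSON is a subset of YAML, so a YAML parser can read this file back in.
  final encoder = JsonEncoder.withIndent('  ');
  File('example.yaml').writeAsStringSync(encoder.convert(map));
}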
I ran into the same issue and ended up hacking together a simple writer:
// Save the updated configuration settings to the config file
void saveConfig() {
  var file = _configFile;

  // truncate existing configuration
  file.writeAsStringSync('');

  // Write out new YAML document from JSON map
  final config = configToJson();
  config.forEach((key, value) {
    if (value is Map) {
      file.writeAsStringSync('\n$key:\n', mode: FileMode.writeOnlyAppend);
      value.forEach((subkey, subvalue) {
        file.writeAsStringSync('  $subkey: $subvalue\n',
            mode: FileMode.writeOnlyAppend);
      });
    } else {
      file.writeAsStringSync('$key: $value\n',
          mode: FileMode.writeOnlyAppend);
    }
  });
}

Emulate string to label dict

Since Bazel does not provide a way to map strings to labels in a rule attribute, I am wondering how to work around this via Skylark.
Below is my partial, horrible "workaround".
First the statics:
_INDEX_COUNT = 50

def _build_label_mapping():
    lmap = {}
    for i in range(_INDEX_COUNT):
        lmap["map_name%s" % i] = attr.string()
        lmap["map_label%s" % i] = attr.label(allow_files = True)
    return lmap

_LABEL_MAPPING = _build_label_mapping()
And in the implementation:
item_pairs = {}
for i in range(_INDEX_COUNT):
    id = getattr(ctx.attr, "map_name%s" % i)
    if not id:
        continue
    mapl = getattr(ctx.attr, "map_label%s" % i)
    if len(mapl.files):
        item_pairs[id] = list(mapl.files)[0].path
    else:
        item_pairs[id] = ""

if item_pairs:
    arguments += [
        "--map", str(item_pairs),  # Pass json data
    ]
And then the rule:
_foo = rule(
    implementation = _impl,
    attrs = dict({
        "srcs": attr.label_list(allow_files = True, mandatory = True),
    }.items() + _LABEL_MAPPING.items()),
)
Which needs to be wrapped like:
def foo(map={}, **kwargs):
    map_args = {}
    # TODO: Check whether order of items is defined
    for i, item in enumerate(map.items()):
        key, value = item
        map_args["map_name%s" % i] = key
        map_args["map_label%s" % i] = value
    return _foo(
        **dict(map_args.items() + kwargs.items())
    )
Is there a better way of doing that in Skylark?
To rephrase your question, you want to create a rule attribute mapping from string to label?
This is currently not supported (see list of attributes), but you can file a feature request for this.
Do you think using "label_keyed_string_dict" is a reasonable workaround? (it won't work if you have duplicated keys)
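A hypothetical sketch of what that could look like (the rule name, attribute names, and _impl are placeholders, not from the answer; note the dict is keyed by label, with the string as the value):
def _impl(ctx):
    item_pairs = {}
    # label_keyed_string_dict yields a dict of Target -> string
    for target, name in ctx.attr.map.items():
        files = target.files.to_list()
        item_pairs[name] = files[0].path if files else ""
    # ... pass str(item_pairs) via "--map" as before

_foo = rule(
    implementation = _impl,
    attrs = {
        "srcs": attr.label_list(allow_files = True, mandatory = True),
        "map": attr.label_keyed_string_dict(allow_files = True),
    },
)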

DustJs - Helpers rendering

I am starting with DustJs in a KrakenJs environment, and I am having some trouble with Dust helpers.
I want to create a helper that builds a simple Bootstrap button for me.
Here is my code:
var dust = require('dustjs-linkedin');

if (!dust.helpers)
    dust.helpers = {};

dust.helpers.bootstrapButton = function (chunk, context, bodies, params) {
    var body = bodies.block || '',
        options = params || {},
        btnStyle = options.style || 'default',
        btnClass = options.class || '',
        btnSize = options.size || '';

    btnStyle = 'btn btn-' + btnStyle;
    if (btnSize)
        btnSize = 'btn-' + btnSize;

    return chunk.write('<button class="' + btnClass + btnStyle + btnSize + '">' + body + '</button>');
};
When I call this helper, I get the body's render function instead of the final text for the body (the button content is "function body_3(chk,ctx){ctx=ctx.shiftBlocks(blocks);return chk.write("test");}").
I tried to use chunk.render, but I get an error because my final HTML is not a function like body.
Do you have any idea?
Regards,
Guillaume
The body is an unevaluated chunk which you need to evaluate before you can concatenate it with your strings.
var curChunk = chunk.data.join(); // Capture anything in chunk prior to this helper
chunk.data = []; // Empty current chunk
var body = bodies.block(chunk).data.join() || '', // Evaluate block and make a string of it
.......
return chunk.write(curChunk + '<button class="' + btnClass + btnStyle + btnSize + '">' + body + '</button>'); // Prefix output with any earlier chunk contents and then build your tag.
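For completeness, a template would invoke the helper roughly like this (the parameter values are just examples, not taken from the question):
{@bootstrapButton style="primary" size="lg" class="my-btn"}Save{/bootstrapButton}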

Getting info on Groovy functions (name, signature, body code)

I have a Groovy file containing a bunch of simple functions like so:
// useful functions
def myFunc1(String arg) {
    println("Hello " + arg)
}

def myFunc2(String arg) {
    println("Goodbye " + arg)
}
I'd like to obtain from this:
- the method name
- the arguments
- the body code of the function
(All as simple strings, I don't need to run anything yet.)
I was about to resort to some Regexing, but since I'm using a JVM language (Scala) I figured I might be able to use some of the Groovy compiler's stuff to do this a "nicer" way.
There seems to be a fair bit of information on loading Groovy code dynamically and running it, but not so much on introspecting the source. Any ideas?
(Failing a "nice" way, I'll also accept some Scala-foo to parse the information in a succinct fashion.)
This works, and demonstrates the token types required to find each node of importance in the AST. Hope it makes sense... By using lots of Groovy dynamism, I hope I haven't made it too hard for a port to Scala :-(
import org.codehaus.groovy.antlr.*
import org.codehaus.groovy.antlr.parser.*
import static org.codehaus.groovy.antlr.parser.GroovyTokenTypes.*

def code = '''
// useful functions
def myFunc1(String arg) {
    println("Hello " + arg)
}

def myFunc2(arg, int arg2) {
    println("Goodbye " + arg)
}

public String stringify( int a ) {
    "$a"
}
'''

def lines = code.split( '\n' )

// Generate a GroovyRecognizer, compile an AST and assign it to 'ast'
def ast = new SourceBuffer().with { buff ->
    new UnicodeEscapingReader( new StringReader( code ), buff ).with { read ->
        read.lexer = new GroovyLexer( read )
        GroovyRecognizer.make( read.lexer ).with { parser ->
            parser.sourceBuffer = buff
            parser.compilationUnit()
            parser.AST
        }
    }
}

// Walks the ast looking for types
def findByPath( ast, types, multiple=false ) {
    [ types.take( 1 )[ 0 ], types.drop( 1 ) ].with { head, tail ->
        if( tail ) {
            findByPath( ast*.childrenOfType( head ).flatten(), tail, multiple )
        }
        else {
            ast*.childrenOfType( head ).with { ret ->
                multiple ? ret[ 0 ] : ret.head()[ 0 ]
            }
        }
    }
}

// Walk through the returned ast
while( ast ) {
    def methodModifier = findByPath( ast, [ MODIFIERS ] ).firstChild?.toStringTree() ?: 'public'
    def returnType = findByPath( ast, [ TYPE, IDENT ] ) ?: 'Object'
    def methodName = findByPath( ast, [ IDENT ] )
    def body = findByPath( ast, [ SLIST ] )
    def parameters = findByPath( ast, [ PARAMETERS, PARAMETER_DEF ], true ).collect { param ->
        [ type: findByPath( param, [ TYPE ] ).firstChild?.toStringTree() ?: 'Object',
          name: findByPath( param, [ IDENT ] ) ]
    }

    def (y1, y2, x1, x2) = [ body.line - 1, body.lineLast - 1, body.column - 1, body.columnLast ]

    // Grab the text from the original string
    def snip = [ lines[ y1 ].drop( x1 ),       // First line prefix stripped
                 *lines[ (y1+1)..<y2 ],        // Mid lines
                 lines[ y2 ].take( x2 ) ].join( '\n' ) // End line suffix stripped

    println '------------------------------'
    println "modifier: $methodModifier"
    println "returns: $returnType"
    println "name: $methodName"
    println "params: $parameters"
    println "$snip\n"

    // Step to next branch and repeat
    ast = ast.nextSibling
}
It prints out:
------------------------------
modifier: public
returns: Object
name: myFunc1
params: [[type:String, name:arg]]
{
    println("Hello " + arg)
}
------------------------------
modifier: public
returns: Object
name: myFunc2
params: [[type:Object, name:arg], [type:int, name:arg2]]
{
    println("Goodbye " + arg)
}
------------------------------
modifier: public
returns: String
name: stringify
params: [[type:int, name:a]]
{
    "$a"
}
Hope it helps, or points you in the right direction :-)
