Passing dust compilation options to Gulp Dust task - dust.js

I would like to pass various dustc options to gulp-dust in my node application. For example, how can I set amd or cwd in gulp-dust? I understand we can pass name and preserveWhiteSpace as compilation options, but what about the other options?
Code Snippet:
var dust = require('gulp-dust');
...
gulp.task('buildTemplates', function () {
    return gulp.src('templates/page.dust')
        .pipe(dust()) // How will I say I need dust.config.amd = true?
        .pipe(gulp.dest('dist'));
});
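No answer is shown in this thread, but purely as an illustrative, unverified sketch: the question already mentions dust.config.amd, and one possible workaround is to flip that flag on the dustjs-linkedin module before the task runs. This assumes gulp-dust compiles with the same module instance the gulpfile requires, which is not guaranteed:
var dustjs = require('dustjs-linkedin'); // assumption: the same dust instance gulp-dust compiles with
dustjs.config.amd = true; // the flag the question asks about

gulp.task('buildTemplates', function () {
    return gulp.src('templates/page.dust')
        .pipe(dust()) // name / preserveWhiteSpace could still be passed here as before
        .pipe(gulp.dest('dist'));
});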

Related

Using global shared libraries in Jenkins to define parameter options

I am trying to use a global class that I've defined in a shared library to help organise job parameters. It's not working, and I'm not even sure if it is possible.
My job looks something like this:
pipelineJob('My-Job') {
    definition {
        // Job definition goes here
    }
    parameters {
        choiceParam('awsAccount', awsAccount.ALL)
    }
}
In a file in /vars/awsAccount.groovy I have the following code:
class awsAccount implements Serializable {
    final String SANDPIT = "sandpit"
    final String DEV = "dev"
    final String PROD = "prod"

    static String[] ALL = [SANDPIT, DEV, PROD]
}
Global pipeline libraries are configured to load implicitly from the my repository's master branch.
When attempting to update the DSL scripts I receive the error:
ERROR: (myJob.groovy, line 67) No such property: awsAccount for class: javaposse.jobdsl.dsl.helpers.BuildParametersContext
Why does it not find the class, and is it even possible to use shared library classes like this in pipeline job?
Disclaimer: I know it works using a Jenkinsfile. Unfortunately, I have not tested it with Declarative Pipelines - but there are no other answers yet, so it may be worth a try.
Regarding your first question: there are several reasons why a class from your shared lib might not be found, starting with the library import, the library syntax, etc. But they definitely do work for DSL. To be more precise about it, additional information would be needed. But be sure that:
You have your groovy class definition using exactly the directory structure as described in the documentation (https://www.jenkins.io/doc/book/pipeline/shared-libraries/)
Give a name to the shared-lib in Jenkins as you configure it, and be sure it is exactly the name you use in the import
Use the import as described in the documentation (under Using Libraries); a minimal example follows
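For reference, a minimal sketch of the documented import in a pipeline script, assuming the library was registered in Jenkins under the (hypothetical) name my-shared-lib; with an implicitly loaded library the annotation can be omitted:
// Jenkinsfile / pipeline script
@Library('my-shared-lib') _
// classes under src/ and globals under vars/ of the library are now resolvable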
Regarding your second question (the one that gives this SO question its title): yes, you can build job parameters from information in your shared lib. At least, using Jenkinsfiles. You can even define properties to be included in the pipeline. I got it working with a slightly tricky syntax due to different problems.
Again, I am using Jenkinsfile and this is what worked for me:
In my shared-lib class, I added a static function that introduces the build parameters. Notice the input parameters that function needs and its usage:
class awsAccount implements Serializable {
    //
    static giveMeParameters (script) {
        return [
            // Some parms
            script.string(defaultValue: '', description: 'A default parameter', name: 'textParm'),
            script.booleanParam(defaultValue: false, description: 'If set to True, do whatever you need - otherwise, do not do it', name: 'boolOption'),
        ]
    }
}
To introduce those parameters in the pipeline, you need to place the returned value of the function into the parameters array
properties([
    parameters(
        awsAccount.giveMeParameters(this)
    )
])
Again, notice the syntax when calling the function. In the same way, you can also define functions in the shared lib that return properties and reuse them across multiple jobs (disableConcurrentBuilds, buildDiscarder, etc.); a sketch follows.
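As an illustration of that last point (the helper name is made up, same calling pattern as giveMeParameters above):
// Hypothetical helper in the same shared-lib class
static giveMeProperties (script) {
    return [
        script.disableConcurrentBuilds(),
        script.buildDiscarder(script.logRotator(numToKeepStr: '10')),
    ]
}
A Jenkinsfile could then splice the returned list into the same properties() call alongside the parameters block.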

Can I set a default test_arg for go_tests and override from CLI?

I use gazelle to generate BUILD files for a go package that has some non-go directories.
I'd like to add -test.short to the go_test runs by default, and then optionally turn it back off from the CLI.
Adding --test_arg="-test.short" to the CLI does not work, since it gets passed to the non-Go tests.
If I could add something to WORKSPACE that modified the default args for go_test based on a select, I'd be good here. Or if I could persuade gazelle to generate my_go_test instead of go_test, I could do some Skylark. Am I missing any way to accomplish this?
I think you can use Bazel's config_setting and select to make this work. config_setting lets you define a predicate which is true or false, depending on command-line arguments. You can provide a --define argument that the config_setting will test. Then you can optionally pass an argument to tests using select.
Something like this might work for you. This will pass the -test.short argument to the test if you pass --define=short=true on the command line. No argument would be passed by default.
config_setting(
    name = "short",
    values = {
        "define": "short=true",
    },
)

go_test(
    name = "go_default_test",
    srcs = ["hello_test.go"],
    args = select({
        ":short": ["-test.short"],
        "//conditions:default": [],
    }),
)
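For completeness, assuming the BUILD file above lives in a package such as //hello (hypothetical path), the two modes look like this on the command line:
# default: select() resolves to [] and no extra argument reaches the test binary
bazel test //hello:go_default_test

# short mode: select() resolves to ["-test.short"]
bazel test //hello:go_default_test --define=short=true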

Jasmine Headless Webkit backbone and Handlebars

I am using the sht_rails gem to render handlebars templates in my Rails 3.2/Backbone App.
I'm hoping to use this .handlebars template in both the backbone and rails portion of the app, but so far I just have it working in the backbone.
I'm using it like so:
class MyApp.views.MyView extends MyApp.views.BaseView
  template: SHT['templates/feed_item']

  render: ->
    data = {}
    @$el.html @template(data)
    @
This works great in app, no problems at all, my handlebars template is looking sweet.
However, this is no good for my js testing (I'm using Jasmine with jasmine-headless-webkit)
This is what happens:
$ jasmine-headless-webkit
ReferenceError: Can't find variable: SHT
This makes total sense, as it seems that the sht_rails gem registers the SHT variable; however, it doesn't seem to do this when I test.
Is there a good way to register the SHT variable when running jhw, or Jasmine by itself? I don't even need the template to render for my test; just knowing that the template is called would be enough for me. But for now, all my Jasmine tests are broken until I figure out how to register this SHT.
Thanks!
We faced the same problem when using jade templates in our Rails 3.2/Backbone/Marionette app via the tilt-jade gem (hence the JST variable in the code samples below). Our solution was to create a template abstraction and then use Jasmine spies to fake a response during spec execution. This approach also allows us to test template usage, construction, etc.
Spies In General
In case you're unfamiliar with Jasmine spies:
Jasmine integrates 'spies' that permit many spying, mocking, and faking behaviors. A 'spy' replaces the function it is spying on.
Abstraction
Create the abstraction:
YourApp.tpl = function(key, data) {
  var path = "templates";
  path += key.charAt(0) === "/" ? key : "/" + key;

  var templateFn = JST[path];

  if (!templateFn) {
    throw new Error('Template "' + path + '" not found');
  }
  if (typeof templateFn !== "function") {
    throw new Error('Template "' + path + '" was a ' + typeof(templateFn) + '. Type "function" expected');
  }

  return templateFn(data);
};
and the requisite monkeypatch:
// MONKEYPATCH - Overriding Renderer to use YourApp template function
Marionette.Renderer = {
  render: function(template, model) {
    return YourApp.tpl(template, model);
  }
};
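For context, a hypothetical view then renders through the abstraction rather than reaching into JST (or SHT) directly, so the spy below can intercept every render; the view and template names here are illustrative:
// Illustrative only - a plain Backbone view rendering via the abstraction
var FeedItemView = Backbone.View.extend({
  render: function() {
    this.$el.html(YourApp.tpl("feed_item", this.model.toJSON()));
    return this;
  }
});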
Template Abstraction Spy (spec_helper.js)
Now we can spy as follows:
spyOn(YourApp, 'tpl').andCallFake(function(key, data) {
  return function() {
    return key;
  };
});
Bonus
Since we are spying on the YourApp.tpl function, we can also test against it:
expect(YourApp.tpl).toHaveBeenCalledWith("your_template", { model: model, property: value });
Addendum
If you don't already know about the jasmine-headless-webkit --runner-out flag and are debugging your Jasmine specs in the wilderness, check out this post to see how to generate a runner output report with a full backtrace for any failures.

grails: guidance on writing scripts, esp for calling existing database-migration scripts

My requirement is to invoke some processing from a Jenkins build server, to determine whether the domain model has changed since the last build. I've come to the conclusion that the way forward is to write a script that will invoke a sequence of existing scripts from the db-migration plugin. Then I can invoke it in the step that calls test-app and war.
I've looked in the Grails docs and at some of the db-migration scripts, and I find I'm stuck - I have no idea where to start trying things. I'd be really grateful if someone could point me at any suitable sources. BTW, I'm a bit rusty in Grails. I started to teach myself two years ago via a proof-of-concept project, which lasted 6 months. Then it was back to Eclipse rich client work. That might be part of my problem, though I never got involved in scripts.
One thing I need in the Jenkins evt is to get hold of the current SVN revision number being used for the build. Suggestions welcome.
Regards, John
Create a new script by running grails create-script scriptname. The database-migration plugin's scripts are written to be easily reused. There is a lot of shared code in _DatabaseMigrationCommon.groovy, and each script defines one target with a unique name. So you can import either the shared script or any standalone script (or multiple scripts) and call the targets like they're methods.
By default the script generated by create-script "imports" the _GrailsInit script via includeTargets << grailsScript("_GrailsInit") and you can do the same, taking advantage of the magic variables that point at installed plugins' directories:
includeTargets << new File("$databaseMigrationPluginDir/scripts/DbmGenerateChangelog.groovy")
If you do this you can remove the include of _GrailsInit since it's already included, but if you don't that's fine since Grails only includes files once.
Then you can define your target and call any of the plugin's targets. The targets cannot accept parameters, but you can add data to the argsMap (this is a map Grails creates from the parsed commandline arguments) to simulate user-specified args. Note that any args passed to your script will be seen by the database-migration plugin's scripts since they use the same argsMap.
Here's an example script that just does the same thing as dbm-generate-changelog but adds a before and after message:
includeTargets << new File("$databaseMigrationPluginDir/scripts/DbmGenerateChangelog.groovy")

target(foo: "Just calls dbmGenerateChangelog") {
    println 'before'
    dbmGenerateChangelog()
    println 'after'
}

setDefaultTarget foo
Note that I renamed the target from main to foo so it's unique, in case you want to call this from another script.
As an example of working with args, here's a modified version that specifies a default changelog name if none is provided:
println 'before'
if (!argsMap.params) {
    argsMap.params = ['foo2.groovy']
}
dbmGenerateChangelog()
println 'after'
Edit: Here's a fuller example that captures the output of dbm-gorm-diff to a string:
includeTargets << new File("$databaseMigrationPluginDir/scripts/_DatabaseMigrationCommon.groovy")

target(foo: "foo") {
    depends dbmInit

    def configuredSchema = config.grails.plugin.databasemigration.schema
    String argSchema = argsMap.schema
    String effectiveSchema = argSchema ?: configuredSchema ?: defaultSchema
    def realDatabase
    boolean add = false // booleanArg('add')
    String filename = null // argsList[0]

    try {
        printMessage "Starting $hyphenatedScriptName"

        ByteArrayOutputStream baos = new ByteArrayOutputStream()
        def baosOut = new PrintStream(baos)

        ScriptUtils.executeAndWrite filename, add, dsName, { PrintStream out ->
            MigrationUtils.executeInSession(dsName) {
                realDatabase = MigrationUtils.getDatabase(effectiveSchema, dsName)
                def gormDatabase = ScriptUtils.createGormDatabase(dataSourceSuffix, config, appCtx, realDatabase, effectiveSchema)
                ScriptUtils.createAndPrintFixedDiff(gormDatabase, realDatabase, realDatabase, appCtx, diffTypes, baosOut)
            }
        }

        String xml = new String(baos.toString('UTF-8'))

        def ChangelogXml2Groovy = classLoader.loadClass('grails.plugin.databasemigration.ChangelogXml2Groovy')
        String groovy = ChangelogXml2Groovy.convert(xml)

        // do something with the groovy or xml here

        printMessage "Finished $hyphenatedScriptName"
    }
    catch (e) {
        ScriptUtils.printStackTrace e
        exit 1
    }
    finally {
        ScriptUtils.closeConnection realDatabase
    }
}

setDefaultTarget foo

Is it possible to pass command-line arguments to a new isolate from spawnUri()

When starting a new isolate with spawnUri(), is it possible to pass command line args into that new isolate?
eg: Command line:
dart.exe app.dart "Hello World"
In app.dart
#import("dart:isolate");
main() {
var options = new Options();
print(options.arguments); // prints ["Hello World"]
spawnUri("other.dart");
}
In other.dart
main() {
  var options = new Options();
  print(options.arguments); // prints [] when spawned from app.dart.
                            // Is it possible to supply
                            // Options from another isolate?
}
Although I can pass data into other.dart through its SendPort, the specific use I want is to invoke another Dart app that hasn't been created with a receivePort callback (such as pub.dart, or any other command-line app).
As far as I can tell the answer is currently no, and it would be hard to simulate via message passing because the options would not be available in main().
I think there are two good feature requests here. One is to be able to pass options on spawn() so that a script can run the same from the root isolate or a spawned isolate.
The other feature, which could be used to implement the first, is a way to pass messages that are handled by libraries before main() is invoked so that objects that main() depends on can be initialized with data from the spawning isolate.
Your example doesn't call print(options.arguments); in other.dart using the current stable SDK.
However
spanUri("other.dart");
spawns an Uri. So how about spawnUri("other.dart?param=value#orViaHash"); and try if you can find the param/value pair via
print(options.executable);
print(options.script);
