Gradle: Set a JVM option on an ANT task

I'm using Gradle 2.1 and have an ANT task defined something like this:
task myTask {
    doFirst {
        ant.taskdef(name: 'mytask',
            classname: 'com.blah.Blah',
            classpath: configurations.gen.asPath
        )
        ant.mytask(foo: 'bar')
    }
}
There is a property I need to pass to com.blah.Blah as a JVM argument (because, instead of doing something sane like passing parameter values in as parameters, the creators of this ANT task have decided that system properties are a reasonable way of conveying information). I've tried a number of things, including:
Setting the systemProperty on all tasks with JavaForkOptions:
tasks.withType(JavaForkOptions) {
    systemProperty 'myproperty', 'blah'
}
Passing -Dmyproperty=blah when I invoke gradle.
Various things involving ant.systemProperty, ant.options.forkOptions, ant.forkOptions, etc. (I can't actually find reliable documentation on any of this.)
I'm at a loss here. It feels like I should be able to say something like:
task myTask {
    doFirst {
        ant.taskdef(name: 'mytask',
            classname: 'com.blah.Blah',
            classpath: configurations.gen.asPath
        )
        ant.systemProperty 'myProperty', 'blah'
        ant.mytask(foo: 'bar')
    }
}
...but that obviously doesn't work.

In Gradle you can use Groovy so there's nothing preventing you from setting the system property programmatically as shown below:
task myTask {
    doFirst {
        System.setProperty('myProperty', 'blah')
        // Use AntBuilder
        System.clearProperty('myProperty')
    }
}
Keep in mind that Gradle's AntBuilder executes Ant logic in the same JVM process as Gradle itself, so any system property you set is visible to other tasks in your build. This can have side effects when two tasks use the same system property (depending on execution order) or when you run your build in parallel.
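If you stick with the system-property route, one way to contain that bleed-through is to set the property right before the Ant task runs and restore the previous value in a finally block. A rough sketch, reusing the taskdef from the question:
task myTask {
    doFirst {
        ant.taskdef(name: 'mytask',
            classname: 'com.blah.Blah',
            classpath: configurations.gen.asPath
        )
        def previous = System.getProperty('myProperty')
        System.setProperty('myProperty', 'blah')
        try {
            ant.mytask(foo: 'bar')
        } finally {
            // put things back even if the Ant task throws
            if (previous != null) {
                System.setProperty('myProperty', previous)
            } else {
                System.clearProperty('myProperty')
            }
        }
    }
}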
Alternatively, you might want to change your Ant task to use Ant properties to drive its logic (if that's even an option). Ant properties can be set from Gradle like this:
task myTask {
    doFirst {
        ant.properties.myProperty = 'blah'
        // Use AntBuilder
    }
}
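The same Ant property can also be set by invoking Ant's own property task through AntBuilder; a sketch (note that Ant properties are immutable, so this form won't overwrite a property that has already been set):
task myTask {
    doFirst {
        // equivalent to <property name="myProperty" value="blah"/> in an Ant build file
        ant.property(name: 'myProperty', value: 'blah')
        // Use AntBuilder
    }
}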

Related

Jenkins pipeline - how to load a Jenkinsfile without first calling node()?

I have a somewhat unique setup where I need to be able to dynamically load Jenkinsfiles that live outside of the src I'm building. The Jenkinsfiles themselves usually call node() and then run some build steps. This eats up multiple executors unnecessarily, because I need to have already called node() in order to use the load step to run a Jenkinsfile (or to execute the Groovy if I read the Jenkinsfile in as a string and execute it).
What I have in the job UI today:
@Library(value='myGlobalLib#head', changelog=false) _
node {
    load "${JENKINSFILES_ROOT}/${PROJECT_NAME}/Jenkinsfile"
}
The Jenkinsfile that's loaded usually also calls node(). For example:
node('agent-type-foo') {
    someBuildFlavor {
        buildProperty = "some value unique to this build"
        someConfig = ["VALUE1", "VALUE2", "VALUE3"]
        runTestTarget = true
    }
}
This causes 2 executors to be consumed during the pipeline run. Ideally, I'd load the Jenkinsfiles without first calling node(), but whenever I try, I get an error message stating:
"Required context class hudson.FilePath is missing
Perhaps you forgot to surround the code with a step that provides this, such as: node"
Is there any way to load a Jenkinsfile or execute Groovy without first having a hudson.FilePath context? I can't seem to find anything in the docs. I'm at the point where I'm going to preprocess the Jenkinsfiles to strip their initial call to node(), call node() myself with the value the Jenkinsfile was using, and then load the rest of the file, but that's too brittle for me to be happy with.
When using the load step, Jenkins evaluates the file. You can wrap your Jenkinsfile's logic in a function (named run() in my example) so that it is loaded but not run automatically.
def run() {
    node('agent-type-foo') {
        someBuildFlavor {
            buildProperty = "some value unique to this build"
            someConfig = ["VALUE1", "VALUE2", "VALUE3"]
            runTestTarget = true
        }
    }
}
// This return statement at the end of the Jenkinsfile is important
return this
Call it from your job script like this:
def jenkinsfile
node {
    jenkinsfile = load "${JENKINSFILES_ROOT}/${PROJECT_NAME}/Jenkinsfile"
}
jenkinsfile.run()
This way there are no nested node blocks, because the first one is closed before the run() function is called.
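Since the loaded script returns this, you can also expose functions that take parameters and call them from the outer script. A rough sketch (the runOn name and its agentLabel parameter are my own, not something the load step requires):
// Jenkinsfile (sketch): take the agent label as a parameter instead of hard-coding it
def runOn(String agentLabel) {
    node(agentLabel) {
        someBuildFlavor {
            buildProperty = "some value unique to this build"
        }
    }
}
return this

// Job script (sketch): load inside a short-lived node, then call the function
def jenkinsfile
node {
    jenkinsfile = load "${JENKINSFILES_ROOT}/${PROJECT_NAME}/Jenkinsfile"
}
jenkinsfile.runOn('agent-type-foo')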

Start jenkins job immediately after creation by seed job, with parameters?

Start jenkins job immediately after creation by seed job
I can start a job from within the job dsl like this:
queue('my-job')
But how do I start a job with arguments or parameters? I want to pass that job some arguments somehow.
Afaik, you can't.
But what you can do is create it from a pipeline (using the jobDsl step) and then run it. Something more or less like...
pipeline {
    agent any
    stages {
        stage('jobs creation') {
            steps {
                jobDsl targets: 'my_job.dsl',
                    additionalParameters: [REQUESTED_JOB_NAME: "my_job's_name"]
                build job: "my_job's_name",
                    parameters: [booleanParam(name: 'DRY_RUN', value: true)]
            }
        }
    }
}
With a barebones 'my_job.dsl'...
pipelineJob(REQUESTED_JOB_NAME) {
    definition {
        // blah...
    }
}
NOTE: As you can see, I explicitly set the name of the job from the calling pipeline (the REQUESTED_JOB_NAME variable) because otherwise I don't know how to make the jobDsl code pass the name of the job it creates back to the calling pipeline.
I use this "trick" to avoid the "job params go one run behind" problem. I use the DRY_RUN param of the job (I use a hidden param, in fact) to run a "do-nothing" build as its name implies, so by the time others need to use the job for "real stuff" its params section has already been properly parsed.
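For illustration, here is roughly what the parameter side of that trick could look like in the job DSL. This is my own sketch, not the original my_job.dsl: it uses the plain Job DSL booleanParam (the hidden-parameter variant I mentioned needs the Hidden Parameter plugin), and the readFileFromWorkspace path is made up:
pipelineJob(REQUESTED_JOB_NAME) {
    parameters {
        // declared in the DSL so the very first (dry-run) build already knows about it
        booleanParam('DRY_RUN', false, 'When true, the pipeline skips all real work')
    }
    definition {
        cps {
            script(readFileFromWorkspace('pipelines/my_job.Jenkinsfile'))
        }
    }
}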

Providing different values in Jenkins dsl configure block to create different jobs

I need my builds to time out at a specific time (a deadline), but currently the Jenkins DSL only supports the "absolute" timeout strategy. So I tried to write a configure block, but I couldn't create jobs with different deadline values.
def settings = [
    [
        jobname: 'job1',
        ddl: '13:10:00'
    ],
    [
        jobname: 'job2',
        ddl: '14:05:00'
    ]
]
for (i in settings) {
    job(i.jobname) {
        configure {
            it / buildWrappers << 'hudson.plugins.build__timeout.BuildTimeoutWrapper' {
                strategy(class: 'hudson.plugins.build_timeout.impl.DeadlineTimeOutStrategy') {
                    deadlineTime(i.ddl)
                    deadlineToleranceInMinutes(1)
                }
            }
        }
        steps {
            // some stuff to do here
        }
    }
}
The above script gives me two jobs with the same deadline time (14:05:00):
<project>
    <actions></actions>
    <description></description>
    <keepDependencies>false</keepDependencies>
    <properties></properties>
    <scm class='hudson.scm.NullSCM'></scm>
    <canRoam>true</canRoam>
    <disabled>false</disabled>
    <blockBuildWhenDownstreamBuilding>false</blockBuildWhenDownstreamBuilding>
    <blockBuildWhenUpstreamBuilding>false</blockBuildWhenUpstreamBuilding>
    <triggers></triggers>
    <concurrentBuild>false</concurrentBuild>
    <builders></builders>
    <publishers></publishers>
    <buildWrappers>
        <hudson.plugins.build__timeout.BuildTimeoutWrapper>
            <strategy class='hudson.plugins.build_timeout.impl.DeadlineTimeOutStrategy'>
                <deadlineTime>14:05:00</deadlineTime>
                <deadlineToleranceInMinutes>1</deadlineToleranceInMinutes>
            </strategy>
        </hudson.plugins.build__timeout.BuildTimeoutWrapper>
    </buildWrappers>
</project>
I found this question but still couldn't get it to work.
You can use the Automatically Generated DSL:
The generated DSL is only supported when running in Jenkins, e.g. it is
not available when running from the command line or in the Playground.
Use The Configure Block to generate custom config elements when not
running in Jenkins.
The generated DSL will not work for all plugins, e.g. if a plugin does not use the @DataBoundConstructor and @DataBoundSetter annotations to declare parameters. In that case The Configure Block can be used to generate the config XML.
Fortunately the Timeout plugin supports @DataBoundConstructor:
@DataBoundConstructor
public DeadlineTimeOutStrategy(String deadlineTime, int deadlineToleranceInMinutes) {
    this.deadlineTime = deadlineTime;
    this.deadlineToleranceInMinutes = deadlineToleranceInMinutes <= MINIMUM_DEADLINE_TOLERANCE_IN_MINUTES ? MINIMUM_DEADLINE_TOLERANCE_IN_MINUTES
            : deadlineToleranceInMinutes;
}
So you should be able to do something like
def settings = [
    [
        jobname: 'job1',
        ddl: '13:10:00'
    ],
    [
        jobname: 'job2',
        ddl: '14:05:00'
    ]
]

for (i in settings) {
    job(i.jobname) {
        wrappers {
            buildTimeoutWrapper {
                strategy {
                    deadlineTimeOutStrategy {
                        deadlineTime(i.ddl)
                        deadlineToleranceInMinutes(1)
                    }
                }
                timeoutEnvVar('WHAT_IS_THIS_FOR')
            }
        }
        steps {
            // some stuff to do here
        }
    }
}
There is an extra layer in BuildTimeoutWrapper which houses the different strategies. When using nested classes like this you need to set the first letter of the class name to lowercase.
EDIT
You can see this in your own Jenkins install by using the 'Job DSL API Reference' link on a job's page:
http://<your jenkins>/plugin/job-dsl/api-viewer/index.html#method/javaposse.jobdsl.dsl.helpers.wrapper.WrapperContext.buildTimeoutWrapper
I saw very similar behaviour to this in a Jenkins DSL groovy script.
I was looping over a List of Maps in a for-each loop, and I also had a configure closure like in your example.
The behaviour I saw was that the Map object in the configure closure seemed to be the same for all iterations of the loop, similar to how you are seeing the same deadline time. I was referencing the same Map value both inside and outside the configure closure, and the DSL output different values: outside the configure closure it was as expected, but inside it was the same value for every iteration.
My solution was just to use a variable to reference the Map value and use that both inside and outside the configure closure; when I did that, the value was consistent.
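For what it's worth, here is a minimal sketch of the behaviour I think is at play: the configure closures only run after the loop has finished, and a closure created inside a for loop sees the final value of the loop variable, whereas a local variable declared in the loop body is captured per iteration:
def closures = []
for (i in ['13:10:00', '14:05:00']) {
    closures << { i }                    // captures the shared loop variable
}
println closures.collect { it() }        // [14:05:00, 14:05:00] - both see the last value

closures = []
for (i in ['13:10:00', '14:05:00']) {
    def ddl = i                          // fresh local variable each iteration
    closures << { ddl }
}
println closures.collect { it() }        // [13:10:00, 14:05:00]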
For your example (just adding a deadlineValue variable, and setting it outside the configure closure):
for (i in settings) {
    def deadlineValue = i.ddl
    job(i.jobname) {
        configure {
            it / buildWrappers << 'hudson.plugins.build__timeout.BuildTimeoutWrapper' {
                strategy(class: 'hudson.plugins.build_timeout.impl.DeadlineTimeOutStrategy') {
                    deadlineTime(deadlineValue)
                    deadlineToleranceInMinutes(1)
                }
            }
        }
        steps {
            // some stuff to do here
        }
    }
}
I would not expect this to make a difference, but it worked for me.
However, I agree with the other answer that it is better to use buildTimeoutWrapper so that you can avoid the configure block.
See: <Your Jenkins URL>/plugin/job-dsl/api-viewer/index.html#path/javaposse.jobdsl.dsl.DslFactory.job-wrappers-buildTimeoutWrapper-strategy-deadlineTimeOutStrategy for more details, obviously you'll need the Build Timeout plugin installed.
For my example I still needed the configure closure for the MultiJob plugin where some parameters were still not configurable via the DSL api.

Generate JAXB episode file with Gradle anttask

I set up a gradle task to generate java classes from XSD files:
ant.taskdef(name: 'xjc', classname: 'com.sun.tools.xjc.XJCTask', classpath: configurations.jaxb.asPath)
ant.jaxbTargetDir = jaxbTargetDir
ant.xjc(destdir: '${jaxbTargetDir}', package: 'com.example') {
    schema(dir: '/home/bruckwald/proj/schema/xsd', includes: '*.xsd')
}
How can I pass the argument -episode my.episode to the ant task so that the episode file will be generated?
I'm using the following dependencies:
jaxb(
    'com.sun.xml.bind:jaxb-core:2.2.11',
    'com.sun.xml.bind:jaxb-impl:2.2.11',
    'com.sun.xml.bind:jaxb-xjc:2.2.11',
    'javax.xml.bind:jaxb-api:2.2.12',
    'org.jvnet.jaxb2_commons:jaxb2-basics-ant:0.9.4'
)
Here's an example from a build of mine that passes other arguments to the XJC task:
ant.xjc(destdir: genDir, package: pkgName, extension: true) {
    classpath { pathelement(path: configurations.xjcrun.asPath) }
    schema(dir: "src/main/resources/schema", includes: schemaName)
    arg(value: "-Xxew")
    arg(value: "-Xfluent-api")
}
I would imagine your "-episode" arg would work just like that.
Note that the "arg" function takes a SINGLE argument. If you need to specify a command-line option that takes a value in addition to the option itself, you'll need TWO arg calls, one for the option string and one for the value, like this:
arg(value: "-episode")
arg(value: "my.episode")
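Putting that together with the snippet from the question, it would look something like this (untested sketch; my.episode is just the file name from the question):
ant.taskdef(name: 'xjc', classname: 'com.sun.tools.xjc.XJCTask', classpath: configurations.jaxb.asPath)
ant.jaxbTargetDir = jaxbTargetDir
ant.xjc(destdir: '${jaxbTargetDir}', package: 'com.example') {
    schema(dir: '/home/bruckwald/proj/schema/xsd', includes: '*.xsd')
    // the option and its value are passed as two separate arg elements
    arg(value: '-episode')
    arg(value: 'my.episode')
}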

grails: guidance on writing scripts, esp for calling existing database-migration scripts

My requirement is to invoke some processing from a Jenkins build server, to determine whether the domain model has changed since the last build. I've come to the conclusion that the way forward is to write a script that will invoke a sequence of existing scripts from the db-migration plugin. Then I can invoke it in the step that calls test-app and war.
I've looked in the Grails docs and at some of the db-migration scripts, and I find I'm stuck: I have no idea where to start trying things. I'd be really grateful if someone could point me at any suitable sources. BTW, I'm a bit rusty in Grails. I started teaching myself two years ago via a proof-of-concept project, which lasted 6 months; then it was back to Eclipse rich client work. That might be part of my problem, though I never got involved in scripts.
One thing I need in the Jenkins environment is to get hold of the current SVN revision number being used for the build. Suggestions welcome.
Regards, John
Create a new script by running grails create-script scriptname. The database-migration plugin's scripts are designed to be easily reused. There is a lot of shared code in _DatabaseMigrationCommon.groovy, and each script defines one target with a unique name, so you can import either the shared script or any standalone script (or multiple scripts) and call the targets as if they were methods.
By default the script generated by create-script "imports" the _GrailsInit script via includeTargets << grailsScript("_GrailsInit") and you can do the same, taking advantage of the magic variables that point at installed plugins' directories:
includeTargets << new File("$databaseMigrationPluginDir/scripts/DbmGenerateChangelog.groovy")
If you do this you can remove the include of _GrailsInit since it's already included, but if you don't that's fine since Grails only includes files once.
Then you can define your target and call any of the plugin's targets. The targets cannot accept parameters, but you can add data to the argsMap (this is a map Grails creates from the parsed commandline arguments) to simulate user-specified args. Note that any args passed to your script will be seen by the database-migration plugin's scripts since they use the same argsMap.
Here's an example script that just does the same thing as dbm-generate-changelog but adds a before and after message:
includeTargets << new File("$databaseMigrationPluginDir/scripts/DbmGenerateChangelog.groovy")

target(foo: "Just calls dbmGenerateChangelog") {
    println 'before'
    dbmGenerateChangelog()
    println 'after'
}

setDefaultTarget foo
Note that I renamed the target from main to foo so it's unique, in case you want to call this from another script.
As an example of working with args, here's a modified version that specifies a default changelog name if none is provided:
println 'before'
if (!argsMap.params) {
    argsMap.params = ['foo2.groovy']
}
dbmGenerateChangelog()
println 'after'
Edit: Here's a fuller example that captures the output of dbm-gorm-diff to a string:
includeTargets << new File("$databaseMigrationPluginDir/scripts/_DatabaseMigrationCommon.groovy")

target(foo: "foo") {
    depends dbmInit

    def configuredSchema = config.grails.plugin.databasemigration.schema
    String argSchema = argsMap.schema
    String effectiveSchema = argSchema ?: configuredSchema ?: defaultSchema
    def realDatabase
    boolean add = false // booleanArg('add')
    String filename = null // argsList[0]
    try {
        printMessage "Starting $hyphenatedScriptName"
        ByteArrayOutputStream baos = new ByteArrayOutputStream()
        def baosOut = new PrintStream(baos)
        ScriptUtils.executeAndWrite filename, add, dsName, { PrintStream out ->
            MigrationUtils.executeInSession(dsName) {
                realDatabase = MigrationUtils.getDatabase(effectiveSchema, dsName)
                def gormDatabase = ScriptUtils.createGormDatabase(dataSourceSuffix, config, appCtx, realDatabase, effectiveSchema)
                ScriptUtils.createAndPrintFixedDiff(gormDatabase, realDatabase, realDatabase, appCtx, diffTypes, baosOut)
            }
        }
        String xml = new String(baos.toString('UTF-8'))
        def ChangelogXml2Groovy = classLoader.loadClass('grails.plugin.databasemigration.ChangelogXml2Groovy')
        String groovy = ChangelogXml2Groovy.convert(xml)
        // do something with the groovy or xml here
        printMessage "Finished $hyphenatedScriptName"
    }
    catch (e) {
        ScriptUtils.printStackTrace e
        exit 1
    }
    finally {
        ScriptUtils.closeConnection realDatabase
    }
}

setDefaultTarget foo
