I set up a Gradle task to generate Java classes from XSD files:
ant.taskdef(name: 'xjc', classname: 'com.sun.tools.xjc.XJCTask', classpath: configurations.jaxb.asPath)
ant.jaxbTargetDir = jaxbTargetDir
ant.xjc(destdir: '${jaxbTargetDir}', package: 'com.example') {
    schema(dir: '/home/bruckwald/proj/schema/xsd', includes: '*.xsd')
}
How can I pass the argument -episode my.episode to the Ant task so that the episode file will be generated?
I'm using the following dependencies:
jaxb(
    'com.sun.xml.bind:jaxb-core:2.2.11',
    'com.sun.xml.bind:jaxb-impl:2.2.11',
    'com.sun.xml.bind:jaxb-xjc:2.2.11',
    'javax.xml.bind:jaxb-api:2.2.12',
    'org.jvnet.jaxb2_commons:jaxb2-basics-ant:0.9.4'
)
Here's an example from a build of mine that passes other arguments to the XJC task:
ant.xjc(destdir: genDir, package: pkgName, extension: true) {
    classpath { pathelement(path: configurations.xjcrun.asPath) }
    schema(dir: "src/main/resources/schema", includes: schemaName)
    arg(value: "-Xxew")
    arg(value: "-Xfluent-api")
}
I would imagine your "-episode" arg would work just like that.
Note that the "arg" function takes a SINGLE argument. If you need to specify a command-line option that takes a value in addition to the option itself, you'll need TWO arg calls: one for the option string and one for the value. So it might look like this:
arg(value: "-episode")
arg(value: "my.episode")
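Combined with the snippets above, the whole task might look something like this (a sketch, untested; it reuses the taskdef, schema directory, and package from the question):

```groovy
// Sketch combining the question's taskdef/xjc call with the two-part
// -episode argument. "my.episode" is just an example file name.
ant.taskdef(name: 'xjc', classname: 'com.sun.tools.xjc.XJCTask',
            classpath: configurations.jaxb.asPath)
ant.xjc(destdir: '${jaxbTargetDir}', package: 'com.example') {
    schema(dir: '/home/bruckwald/proj/schema/xsd', includes: '*.xsd')
    arg(value: '-episode')
    arg(value: 'my.episode')
}
```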
My motivation: our codebase is scattered across at least 20 git repos. I want to consolidate everything into a single git repo with a single build system. Currently we use SBT, but we think the build would take too long, so I am examining the possibility of using Bazel instead.
Most of our codebase uses Scala 2.12, some of our codebase uses Scala 2.11, and the rest needs to build under both Scala 2.11 and Scala 2.12.
I'm trying to use bazelbuild/rules_scala.
With the following call to scala_repositories in my WORKSPACE, I can build using Scala 2.12:
scala_repositories(("2.12.6", {
    "scala_compiler": "3023b07cc02f2b0217b2c04f8e636b396130b3a8544a8dfad498a19c3e57a863",
    "scala_library": "f81d7144f0ce1b8123335b72ba39003c4be2870767aca15dd0888ba3dab65e98",
    "scala_reflect": "ffa70d522fc9f9deec14358aa674e6dd75c9dfa39d4668ef15bb52f002ce99fa"
}))
If I have the following call instead, I can build using Scala 2.11:
scala_repositories(("2.11.12", {
    "scala_compiler": "3e892546b72ab547cb77de4d840bcfd05c853e73390fed7370a8f19acb0735a0",
    "scala_library": "0b3d6fd42958ee98715ba2ec5fe221f4ca1e694d7c981b0ae0cd68e97baf6dce",
    "scala_reflect": "6ba385b450a6311a15c918cf8688b9af9327c6104f0ecbd35933cfcd3095fe04"
}))
However, it is not possible to specify at the package level, in my BUILD files, which version(s) of Scala to build with; I must specify this globally in my WORKSPACE.
To work around this, my plan is to set up configurable attributes, so that I can pass --define scala=2.11 to build with Scala 2.11 and --define scala=2.12 to build with Scala 2.12.
First I tried by putting this code in my WORKSPACE:
config_setting(
    name = "scala-2.11",
    define_values = {
        "scala": "2.11"
    }
)

config_setting(
    name = "scala-2.12",
    define_values = {
        "scala": "2.12"
    }
)
scala_repositories(
    select({
        "scala-2.11": "2.11.12",
        "scala-2.12": "2.12.6"
    }),
    select({
        "scala-2.11": {
            "scala_compiler": "3e892546b72ab547cb77de4d840bcfd05c853e73390fed7370a8f19acb0735a0",
            "scala_library": "0b3d6fd42958ee98715ba2ec5fe221f4ca1e694d7c981b0ae0cd68e97baf6dce",
            "scala_reflect": "6ba385b450a6311a15c918cf8688b9af9327c6104f0ecbd35933cfcd3095fe04",
        },
        "scala-2.12": {
            "scala_compiler": "3023b07cc02f2b0217b2c04f8e636b396130b3a8544a8dfad498a19c3e57a863",
            "scala_library": "f81d7144f0ce1b8123335b72ba39003c4be2870767aca15dd0888ba3dab65e98",
            "scala_reflect": "ffa70d522fc9f9deec14358aa674e6dd75c9dfa39d4668ef15bb52f002ce99fa"
        }
    })
)
But this gave me the error config_setting cannot be in the WORKSPACE file.
So then I tried moving code into a Starlark file.
In tools/build_rules/scala.bzl:
config_setting(
    name = "scala-2.11",
    define_values = {
        "scala": "2.11"
    }
)

config_setting(
    name = "scala-2.12",
    define_values = {
        "scala": "2.12"
    }
)

def scala_version():
    return select({
        "scala-2.11": "2.11.12",
        "scala-2.12": "2.12.6"
    })

def scala_machinery():
    return select({
        "scala-2.11": {
            "scala_compiler": "3e892546b72ab547cb77de4d840bcfd05c853e73390fed7370a8f19acb0735a0",
            "scala_library": "0b3d6fd42958ee98715ba2ec5fe221f4ca1e694d7c981b0ae0cd68e97baf6dce",
            "scala_reflect": "6ba385b450a6311a15c918cf8688b9af9327c6104f0ecbd35933cfcd3095fe04",
        },
        "scala-2.12": {
            "scala_compiler": "3023b07cc02f2b0217b2c04f8e636b396130b3a8544a8dfad498a19c3e57a863",
            "scala_library": "f81d7144f0ce1b8123335b72ba39003c4be2870767aca15dd0888ba3dab65e98",
            "scala_reflect": "ffa70d522fc9f9deec14358aa674e6dd75c9dfa39d4668ef15bb52f002ce99fa"
        }
    })
And back in my WORKSPACE:
load("//tools/build_rules:scala.bzl", "scala_version", "scala_machinery")
scala_repositories(scala_version(), scala_machinery())
But now I get this error:
tools/build_rules/scala.bzl:1:1: name 'config_setting' is not defined
This confuses me, because I thought config_setting() was built in. I can't find where I should load it in from.
So, my questions:
How do I load config_setting() into my .bzl file?
Or, is there a better way of controlling from the command line which arguments get passed to scala_repositories()?
Or, is this just not possible?
$ bazel version
Build label: 0.17.2-homebrew
Build target: bazel-out/darwin-opt/bin/src/main/java/com/google/devtools/build/lib/bazel/BazelServer_deploy.jar
Build time: Fri Sep 28 10:42:37 2018 (1538131357)
Build timestamp: 1538131357
Build timestamp as int: 1538131357
If you call a native rule from a .bzl file, you must use the native. prefix, so in this case you would call native.config_setting.
However, this is going to lead to the same error: config_setting is a BUILD rule, not a WORKSPACE rule.
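For reference, config_setting itself is only legal in a BUILD file. A sketch of what that would look like (e.g. in a hypothetical tools/build_rules/BUILD; note it still cannot feed select() values into a WORKSPACE rule such as scala_repositories):

```python
# tools/build_rules/BUILD (sketch): config_setting is a BUILD rule, so it
# must live in a BUILD file rather than in WORKSPACE or at a .bzl top level.
# These settings would then be referenced by label, e.g.
# "//tools/build_rules:scala-2.11", in select() calls inside BUILD rules.
config_setting(
    name = "scala-2.11",
    define_values = {"scala": "2.11"},
)

config_setting(
    name = "scala-2.12",
    define_values = {"scala": "2.12"},
)
```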
If you want to change the build tool used for a particular target, you can change the toolchain, and this seems to be supported via scala_toolchain. And I believe you can use a config to select the toolchain.
I'm unfamiliar with what scala_repositories does. I hope it defines the toolchain under a properly versioned name, so that you can reference the desired toolchain, and that it can be invoked twice in the same workspace; otherwise I think there is no solution.
Update:
From the bottom of the "Automatically Generated DSL" wiki entry: "The generated DSL is only supported when running in Jenkins, ..."
Since slackNotifier is generated DSL, it doesn't appear that there is a way to test this in our particular infrastructure. We're going to write a function which generates the config using the configure block.
I have a seed job definition which is failing gradle test even though it seems to work fine when we use it in Jenkins.
Job Definition Excerpt
//package master
// GitURL
def gitUrl = 'https://github.com/team/myapp'
def slackRoom = null
job('seed-dsl') {
    description('This seed is updated from the seed-dsl-updater job')
    properties {
        //Set github project URL
        githubProjectUrl(gitUrl)
    }
    ...
    // publishers is another name for post build steps
    publishers {
        mailer('', false, true)
        slackNotifier {
            room(slackRoom)
            notifyAborted(true)
            notifyFailure(true)
            notifyNotBuilt(true)
            notifyUnstable(true)
            notifyBackToNormal(true)
            notifySuccess(false)
            notifyRepeatedFailure(false)
            startNotification(false)
            includeTestSummary(false)
            includeCustomMessage(false)
            customMessage(null)
            buildServerUrl(null)
            sendAs(null)
            commitInfoChoice('NONE')
            teamDomain(null)
            authToken(null)
        }
    }
}
The gradle test command works fine when I comment out the slackNotifier declaration, but fails with the following error when it's enabled:
Test output excerpt
Caused by:
javaposse.jobdsl.dsl.DslScriptException: (script, line 79) No signature of method: javaposse.jobdsl.dsl.helpers.publisher.PublisherContext.slackNotifier() is applicable for argument types: (script$_run_closure1$_closure9$_closure14) values: [script$_run_closure1$_closure9$_closure14#d2392a1]
Possible solutions: stashNotifier(), stashNotifier(groovy.lang.Closure)
at javaposse.jobdsl.dsl.DslScriptLoader.runScriptEngine(DslScriptLoader.groovy:135)
at javaposse.jobdsl.dsl.DslScriptLoader.runScriptsWithClassLoader_closure1(DslScriptLoader.groovy:78)
According to the migration doc, slackNotifier has been supported since 1.47. In my build.gradle, I'm using 1.48. I see the same errors with plugin version 1.50.
build.gradle excerpt
ext {
    jobDslVersion = '1.48'
    ...
}
...
// Job DSL plugin including plugin dependencies
testCompile "org.jenkins-ci.plugins:job-dsl:${jobDslVersion}"
testCompile "org.jenkins-ci.plugins:job-dsl:${jobDslVersion}#jar"
...
The build.gradle also includes the following, as suggested by the testing docs (https://github.com/jenkinsci/job-dsl-plugin/wiki/Testing-DSL-Scripts):
testPlugins 'org.jenkins-ci.plugins:slack:2.0.1'
What do I need to do to be able to successfully test my job definitions? Is this a bug, or have I missed something else?
EDIT
I see I missed the point.
The new approach is to reuse the @DataBoundConstructor exposed by plugins, so nothing needs to be written to support a new plugin, assuming it has a @DataBoundConstructor.
Your SlackNotifier has one; note that the DSL converts the first letter to lowercase for you:
@DataBoundConstructor
public SlackNotifier(
        final String teamDomain,
        final String authToken,
        final String room,
        final String buildServerUrl,
        final String sendAs,
        final boolean startNotification,
        final boolean notifyAborted,
        final boolean notifyFailure,
        final boolean notifyNotBuilt,
        final boolean notifySuccess,
        final boolean notifyUnstable,
        final boolean notifyBackToNormal,
        final boolean notifyRepeatedFailure,
        final boolean includeTestSummary,
        CommitInfoChoice commitInfoChoice,
        boolean includeCustomMessage,
        String customMessage) {
    ...
}
Unfortunately, there is an embedded type in the parameter list, CommitInfoChoice, which does not have a @DataBoundConstructor; it's an enum, too.
public enum CommitInfoChoice {
    NONE("nothing about commits", false, false),
    AUTHORS("commit list with authors only", true, false),
    AUTHORS_AND_TITLES("commit list with authors and titles", true, true);
    ...
}
I'll go out on a limb and say that it won't work out of the box until the nested enum implements a data-bound constructor and also has a descriptor, sorry.
I don't have the plugin, but you can look at the XML for a real job created with the plugin and see what goes into this section. I suspect it is a nested structure.
You can also try the Job DSL Google group; there is a post there about the generic approach.
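As a sketch of the configure-block fallback, something like the following may work. The element name jenkins.plugins.slack.SlackNotifier is an assumption here; check the config.xml of a real job created with the plugin before relying on it, and fill in whichever child elements that XML shows:

```groovy
// Hypothetical configure-block sketch: builds the publisher XML directly,
// bypassing the generated DSL. Element names are assumptions taken from
// what the plugin's config.xml would typically contain.
job('seed-dsl') {
    configure { project ->
        project / publishers << 'jenkins.plugins.slack.SlackNotifier' {
            room('')
            notifyAborted(true)
            notifyFailure(true)
            notifyBackToNormal(true)
            notifySuccess(false)
            commitInfoChoice('NONE')
        }
    }
}
```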
We ran into this as well. The solution for us was to add the Slack plugin version we were using on Jenkins to our list of plugins in Gradle.
To be more specific, in our build.gradle file, under dependencies, we added the following code to get our plugins included and hence allow the auto-generated DSL to work.
You can see this described, along with an example for a different plugin next to testPlugins, at:
https://github.com/jenkinsci/job-dsl-plugin/wiki/Testing-DSL-Scripts
Like the following:
dependencies {
    ...
    // plugins to install in test instance
    testPlugins 'org.jenkins-ci.plugins:ghprb:1.31.4'
    testPlugins 'com.coravy.hudson.plugins.github:github:1.19.0'
}
I am writing a custom Grails script. I want custom help, options, etc.
According to the docs (http://grails.github.io/grails-doc/latest/guide/commandLine.html#creatingCustomScripts), I just need to do:
description("Generates a controller that performs CRUD operations and the associated views") {
    usage "grails generate-all [DOMAIN CLASS]"
    flag name: 'force', description: "Whether to overwrite existing files"
    argument name: 'Domain Class', description: 'The name of the domain class'
}
However when I add that to my script, I get:
Warning: Error caching created help for /server/scripts/ExecuteDBScript.groovy: No signature of method: ExecuteDBScript.description() is applicable for argument types: (java.lang.String, ExecuteDBScript$_run_closure1) values: [Generates a controller that performs CRUD operations and the associated views, ...]
My script looks like:
includeTargets << grailsScript("_GrailsInit")

description("Generates a controller that performs CRUD operations and the associated views") {
    usage "grails generate-all [DOMAIN CLASS]"
    flag name: 'force', description: "Whether to overwrite existing files"
    argument name: 'Domain Class', description: 'The name of the domain class'
}

/**
 * Script to execute the DB script.
 */
target(main: "This script executes DB Script") {
    ...
}
Any ideas?
The documentation is poor, but I found this solution:
includeTargets << grailsScript("_GrailsBootstrap")

USAGE = """
grails script-name [PARAM]
where
PARAM = Description
"""

target(default: "command description") {
    //...
}
Your link (http://grails.github.io/grails-doc/latest/guide/commandLine.html#creatingCustomScripts) refers to the latest version of Grails, which is a non-stable version (3.0.0.M2).
You are probably using the latest stable version, 2.4.4, so the correct docs are here: http://grails.github.io/grails-doc/2.4.4/ref/Command%20Line/create-script.html
I'm using Gradle 2.1 and have an ANT task defined something like this:
task myTask {
    doFirst {
        ant.taskdef(name: 'mytask',
                    classname: 'com.blah.Blah',
                    classpath: configurations.gen.asPath
        )
        ant.mytask(foo: 'bar')
    }
}
There is a property I need to pass to the com.blah.Blah as a JVM argument (because, instead of doing something sane like passing parameter values in as parameters, the creators of this ANT task have decided that system properties are a reasonable way of conveying information). I've tried a number of things, including:
Setting the systemProperty on all tasks with JavaForkOptions:
tasks.withType(JavaForkOptions) {
    systemProperty 'myproperty', 'blah'
}
Passing -Dmyproperty=blah when I invoke gradle.
Various things involving ant.systemProperty, ant.options.forkOptions, ant.forkOptions, etc. (I can't actually find reliable documentation on any of this.)
I'm at a loss here. It feels like I should be able to say something like:
task myTask {
    doFirst {
        ant.taskdef(name: 'mytask',
                    classname: 'com.blah.Blah',
                    classpath: configurations.gen.asPath
        )
        ant.systemProperty 'myProperty', 'blah'
        ant.mytask(foo: 'bar')
    }
}
...but that obviously doesn't work.
In Gradle you can use Groovy, so there's nothing preventing you from setting the system property programmatically, as shown below:
task myTask {
    doFirst {
        System.setProperty('myProperty', 'blah')
        // Use AntBuilder
        System.clearProperty('myProperty')
    }
}
Keep in mind that Gradle's AntBuilder executes Ant logic in the same process used for Gradle. Therefore, setting a system property will be available to other tasks in your build. This might have side effects when two tasks use the same system property (depending on the execution order) or if you run your build in parallel.
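To limit that leakage, the set/clear pair from above can be wrapped in a try/finally so the property is cleared even if the Ant call throws (a sketch; the ant.mytask call stands in for whatever AntBuilder logic you run):

```groovy
// Sketch: scope the system property to just the Ant invocation, so a
// failure inside the Ant task doesn't leave the property set for the
// rest of the build.
task myTask {
    doFirst {
        System.setProperty('myProperty', 'blah')
        try {
            ant.mytask(foo: 'bar')   // use AntBuilder here
        } finally {
            System.clearProperty('myProperty')
        }
    }
}
```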
Instead you might want to change your Ant task to use Ant properties instead to drive your logic (if that's even an option). Ant properties can be set from Gradle as such:
task myTask {
    doFirst {
        ant.properties.myProperty = 'blah'
        // Use AntBuilder
    }
}
I found this question:
How can I get a list of build targets in Ant?
What I'd like to know: is there a way to get a list of targets together with their depends-on values? We have a large build.xml file, and the way it's currently written, the presence or absence of a description doesn't really tell me whether a target is a main target or an "other" target.
I'm running Ant 1.8.1. This is an initial bit of due diligence as I prepare to migrate to Gradle, so I need to figure out which targets are truly the "high-level" targets and which are "supporting" targets.
Note: I work in a locked-down environment, so downloading third-party software or Ant extensions is not an option.
Additional note: if this level of detail is not possible in Ant, that is a valid answer as well.
In Ant 1.8.2 and above, use the -d flag to print debug info:
ant -p -d <your main build file>
and you'll get details like this:
javadoc
  depends on: resolve
javadoc.distribute
latest-ivy
package
  depends on: -maybe-package-by-bom, -maybe-package-by-spec, -maybe-package-for-dc
The -d flag will also print the "other" targets (those without descriptions) that aren't printed by ant -p, along with their dependencies.
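If you just need a rough inventory, a regex-based sketch using only standard Unix tools (nothing to download, which suits a locked-down environment) can also pull each target's attributes, including depends=. It assumes each <target ...> opening tag fits on one line, which is typical; the sample build file below is only for demonstration:

```shell
# Quick-and-dirty listing of <target> declarations and their attributes.
# Create a small sample build file to demonstrate against:
cat > /tmp/sample-build.xml <<'EOF'
<project name="demo" default="package">
  <target name="compile" description="Compile sources"/>
  <target name="test" depends="compile"/>
  <target name="package" depends="compile,test" description="Build the jar"/>
</project>
EOF
# Extract each opening <target ...> tag and strip the leading "<target ":
grep -oE '<target[^/>]*' /tmp/sample-build.xml | sed -E 's/^<target +//'
```

Point the grep at your real build.xml instead of the sample; unlike ant -p, this sees imported files only if you run it on each of them.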
If you want a recursive tree listing, you can use this XQuery script with Saxon:
(:~
: XQuery to display the dependencies of an Ant target.
:
: There are two modes of operation:
: 1) Display all targets and immediate dependencies, specified by $project-file
: 2) Show a tree of a single targets dependencies, this happens when $target-name is set as well.
:
: External parameters:
: $project-file The initial Ant file to start parsing from (imports will be expanded)
: $target-name If specified we examine only a single target and produce a tree of all dependencies (recursively)
: $show-file Whether the file path of the dependency should be shown
:
: Example Usage: java -cp Saxon-HE-9.7.0-18.jar net.sf.saxon.Query -q:ant-show-deps.xqy \!indent=yes project-file=file:/Users/are/exist-git/build.xml target-name=installer show-file=true
:
: If you don't want to specify the $target-name you can pass ?target-name=\(\) to Saxon on the command line.
:
: @author Adam Retter
:)
xquery version "1.0";
declare variable $project-file external;
declare variable $target-name as xs:string? external;
declare variable $show-file as xs:boolean external;
declare function local:expand-import-targets($file as xs:string) as element(target)* {
local:expand-import-targets($file, ())
};
declare function local:expand-import-targets($file as xs:string, $visited as xs:string*) as element(target)* {
let $path := local:resolve($file, $visited[1])
return
if(not($visited = $path))then
let $imported-project := doc($path)/project
return
(
for $target in $imported-project/target
return
<target name="{$target/@name}" file="{$path}">
{
for $dependency in tokenize(replace($target/@depends, '\s+', ''), ',')
return
<dependency name="{$dependency}"/>
}
</target>
,
for $import in $imported-project/import
return
local:expand-import-targets($import/@file, ($path, $visited))
)
else()
};
declare function local:resolve($file as xs:string, $prev-file as xs:string?) {
if(not($prev-file))then
$file
else if(starts-with($file, "/") or starts-with($file, "file:/"))then
$file
else
resolve-uri($file, $prev-file)
};
declare function local:target-tree($target-name as xs:string, $targets as element(target)*) as element(target)? {
let $target := $targets[@name eq $target-name]
return
element target {
$target/@name,
if($show-file)then
$target/@file
else(),
for $dependency in $target/dependency
return
local:expand-dependency($dependency/@name, $targets)
}
};
declare function local:expand-dependency($dependency-name as xs:string, $targets as element(target)*) {
for $expanded in $targets[@name eq $dependency-name]
return
element dependency {
$expanded/@name,
if($show-file)then
$expanded/@file
else(),
for $sub-dependency in $expanded/dependency
return
local:expand-dependency($sub-dependency/@name, $targets)
}
};
let $targets := local:expand-import-targets($project-file)
return
if($target-name)then
local:target-tree($target-name, $targets)
else
<targets>
{
for $target in $targets
order by $target/@name
return $target
}
</targets>