waf - custom command tooltip - target

I create a custom command via
from waflib.Build import BuildContext

def spawn_pot(ctx):
    ctx.recurse('po')

class spawnpot(BuildContext):
    cmd = 'spawnpot'
    fun = 'spawn_pot'
and I would like to set a tooltip, but I could not find how to do that. Currently the waf --help looks like this (truncated):
Main commands (example: ./waf build -j4)
build : executes the build
...
updatepo :
...

Just give your function/class a docstring, using either a single-line "..." or a triple-quoted """ ... """ string:
def spawn_pot(ctx):
    """ Spawn Pot Target
    Multiline
    """
    ctx.recurse('po')

class spawnpot(BuildContext):
    "SpawnPot Build Context Description"
    cmd = 'spawnpot'
    fun = 'spawn_pot'
This should give you something like this:
the_prompt> waf -h
...
spawn_pot: Spawn Pot Target
           Multiline
spawnpot : SpawnPot Build Context Description
...


Automatically reconfigure a waf project if configuration files change

I have a wscript which reads some files during the configure step and, based on this, sets some variables.
How do I get waf to automatically re-configure the project if one of the configuration files changes when running waf build instead of waf configure build?
Consider the following scenario:
waf configure
waf build
The content of the configuration file a.config is changed
The user just runs waf build, instead of waf configure build.
--> What must the wscript look like so that, before running build, it checks whether the configuration files have changed and, if so, reconfigures the project?
Example:
There is a file a.config and the wscript looks like this:
# wscript
def configure(conf):
    a = conf.path.find_node('a.config')
    conf.env.config = a.read()

def build(bld):
    # check the configuration files are up to date.
    # actual build
    pass
configure is not really for that. You can use the same code but in build:
def build(bld):
    a = bld.path.find_node('a.config')
    bld.env.config = a.read()
    bld(features = "myfeature", vars = ["config"], ...)
You can directly use configure with autoconfig:
from waflib import Configure

def options(opt):
    Configure.autoconfig = True

def configure(conf):
    conf.env.config = "my_value"

def my_processing(task):
    print("Processing...")

def build(bld):
    bld(rule = my_processing, vars = ["config"])
Any change to conf.env.config will trigger a rebuild.
If you need to have separate config files, you can use load:
def configure(conf):
    pass

def my_processing(task):
    print("Processing...")

def build(bld):
    bld.load("my_config", tooldir="my_config_dir")
    bld(rule = my_processing, vars = ["config1", "config2"])
with a my_config_dir/my_config.py file like this:
def build(bld):
    bld.env.config1 = "one"
    bld.env.config2 = "two"
    # ...
bld.load() will execute the build function in my_config. Any change to config1 or config2 will trigger a rebuild.

Run any command from a configuration file

I am writing a shared library for Jenkins in which I have a method that reads a configuration file (YAML) and should execute commands based on the input.
Example configuration file:
commands:
  - name: command 1
    command: "sh 'ls -la'"
  - name: command 2
    command: "readYaml file: 'demo.yaml'"
The method code:
def command_executor(config) {
    config.commands.each { command ->
        this.script.echo "running ${command.name} command"
        // This is my problem how to run the command
        command.command.execute().text
    }
}
The above method is defined in my class and I call it from the /var/my_command_executer.groovy file.
How can I run any command from the string parameter?
I found the below solution:
Create a temporary Groovy file with a predefined method name that calls the command.
Load the temporary file and call the predefined method.
Something like:
def command_executor(config) {
    config.commands.each { command ->
        this.script.echo "running ${command.name} command"
        // note the comma between the named writeFile arguments
        this.script.writeFile file: "temp.groovy", text: """
            def my_command_executor(){
                ${command.command}
            }
            return this   // so the loaded script's methods can be called
        """
        // inside a library class, pipeline steps go through this.script
        def temp_command_executor = this.script.load "temp.groovy"
        temp_command_executor.my_command_executor()
    }
}
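For reference, a minimal usage sketch from the pipeline side, assuming the configuration is parsed with readYaml from the Pipeline Utility Steps plugin; the file name config.yaml and the way your library exposes command_executor are placeholders, not part of the original solution:

def config = readYaml file: 'config.yaml'   // placeholder file name
command_executor(config)                     // the method shown above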

In Jenkins Groovy script, how can I grab a return value from a batch file

I am looking to grab the result of a batch file that is executed within a Jenkins pipeline Groovy script.
I know that I can do this:
def result = "pem.cmd Test_Application.pem".execute().text
However, I need to run a batch of commands and grab the result of the batch file. The example above only has one command. I need to first change directory and then execute the "cmd" file with a parameter, so I attempted the following:
def cmd = new StringBuilder()
cmd.append("CD \"${path}\"\n")
cmd.append("IF %ERRORLEVEL% NEQ 0 EXIT /B %ERRORLEVEL%\n")
cmd.append("pem.cmd Test_Application.pem\n")
//echo bat(returnStdout: true, script: cmd.toString())
def result = bat cmd.toString()
echo result
The "result" variable is null even though the log shows that the command did return a result. I know I could output the batch file results to text file, and read the text file, but I would just like to see if I can grab the result, like I attempted above. Any help is appreciated.
OK, I got it to work as follows:
def cmd = new StringBuilder()
cmd.append("CD \"${path}\"\n")
cmd.append("pem.cmd Test_Application.pem\n")
def x = bat(
    returnStdout: true,
    script: "${cmd.toString()}"
)
echo x
That does it.
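A side note, not from the answer above: with returnStdout the bat step also captures cmd.exe's echo of the commands themselves, so prefixing each line with @ and trimming the output keeps the captured result clean. A hedged variant of the same call:

def x = bat(
    returnStdout: true,
    script: "@CD \"${path}\"\n@pem.cmd Test_Application.pem"
).trim()   // @ suppresses the command echo; trim() drops the trailing newline
echo x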

Grails script and passing parameters using Groovy CliBuilder?

I have created a very simple script and would like to pass arguments to the script.
like:
grails> helloworld -n Howdy
grails> helloworld -name Howdy
with the script:
target(main: 'Hello World') {
    def cli = new CliBuilder()
    cli.with {
        h(longOpt: 'help', 'Help - Usage Information')
        n(longOpt: 'name', 'Name to say hello to', args: 1, required: true)
    }
    def opt = cli.parse(args)
    if (!opt) return
    if (opt.h) cli.usage()
    println "Hello ${opt.n}"
}
I seem to fail in every attempt. The script keeps complaining that the -n option is not present.
When I debug the script, the value of the args parameter looks like the values have been rearranged.
When calling the script with:
grails> helloworld -n Howdy
the value of args inside the script is: Howdy -n
What am I missing here or doing wrong? Any suggestions?
Your problem is that you're running your code through the Grails shell. I've converted your code to CLI.groovy like this:
class CLI {
    public static void main(String[] args) {
        def cli = new CliBuilder()
        cli.with {
            h(longOpt: 'help', 'Help - Usage Information')
            n(longOpt: 'name', 'Name to say hello to', args: 1, required: true)
        }
        def opt = cli.parse(args)
        if (!opt) return
        if (opt.h) cli.usage()
        println "Hello ${opt.n}"
    }
}
After that I'm using the groovy command to run it from the Linux shell like this:
archer#capitan $ groovy CLI -n Daddy
It outputs:
archer#capitan $ groovy CLI -n Daddy
Hello Daddy
So it works like a charm.
I did a Google search for site:github.com grailsScript CliBuilder and came across:
https://github.com/Grails-Plugin-Consortium/grails-cxf/blob/master/scripts/WsdlToJava.groovy
That gave me the hint that the args variable needs to be formatted. Unfortunately it mutates -n Howdy into Howdy\n-n (not sure why the order is rearranged or the newline character is added).
The GitHub page above has a doSplit() method to handle some of this, but it keeps the rearranged order. The best thing I've found is to remove the space between -n and Howdy, which works with CliBuilder.
The following is what I have working:
target(main: 'Hello World') {
    def cli = new CliBuilder()
    cli.with {
        h(longOpt: 'help', 'Help - Usage Information')
        n(longOpt: 'name', 'Name to say hello to', args: 1, required: true)
    }
    def ops = doSplit(args)
    def opt = cli.parse(ops)
    if (!opt) return
    if (opt.h) cli.usage()
    println "Hello ${opt.n}"
}

// split the single argument string on newlines, spaces and '=' signs,
// trim each piece and drop empty entries
private doSplit(String string) {
    string.split(/(\n|[ ]|=)/).collect { it.trim() }.findResults { it && it != '' ? it : null }
}
Run this with: helloworld -nHowdy

Getting the build status in post-build script

I would like to have a post-build hook or similar, so that I can have the same output as e.g. the IRC plugin, but pass it to a script.
I was able to get all the info except for the actual build status. This just doesn't work, whether as a "Post-build script", a "Post-build task", a "Parameterized Trigger", and so on.
It is possible with some very ugly workarounds, but I wanted to ask in case someone has a nicer option ... short of writing my own plugin.
It works as mentioned with the Groovy Post-Build Plugin, but without any extra quoting within the string that gets executed. So I had to put the actual functionality into a shell script that does a call to curl, which in turn needs quoting for the POST parameters and so on.
def result = manager.build.result
def build_number = manager.build.number
def env = manager.build.getEnvironment(manager.listener)
def build_url = env['BUILD_URL']
def build_branch = env['SVN_BRANCH']
def short_branch = ( build_branch =~ /branches\//).replaceFirst("")
def host = env['NODE_NAME']
def svn_rev = env['SVN_REVISION']
def job_name = manager.build.project.getName()
"/usr/local/bin/skypeStagingNotify.sh Deployed ${short_branch} on ${host} - ${result} - ${build_url}".execute()
Use a Groovy script in a post-build step via the Groovy Post-Build plugin. You can then access Jenkins internals via the Jenkins Java API. The plugin provides the script with the variable manager, which can be used to access important parts of the API (see the Usage section in the plugin documentation).
For example, here's how you can execute a simple external Python script on Windows and output its result (as well as the build result) to build console:
def command = """cmd /c python -c "for i in range(1,5): print i" """
manager.listener.logger.println command.execute().text
def result = manager.build.result
manager.listener.logger.println "And the result is: ${result}"
For this I really like the Conditional Build Step plugin. It's very flexible, and you can choose which actions to take based on build failure or success. For instance, I use a conditional build step to send a notification on build failure.
You can also use conditional build step to set an environment variable or write to a log file that you use in subsequent "execute shell" steps. So for instance, you might create a build with three steps: one step to compile code/run tests, another to set a STATUS="failed" environment variable, and then a third step which sends an email like The build finished with a status: ${STATUS}
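For comparison only, here is a rough sketch of that three-step idea written as a scripted Pipeline instead of freestyle conditional build steps; the shell command and recipient address are placeholders, not taken from the answer above:

node {
    def STATUS = 'success'
    try {
        sh './run_tests.sh'           // step 1: compile code / run tests (placeholder)
    } catch (err) {
        STATUS = 'failed'             // step 2: record the failure
        throw err
    } finally {
        // step 3: report the status, whatever it turned out to be
        mail to: 'someone@example.com',
             subject: "Build ${env.JOB_NAME} #${env.BUILD_NUMBER}",
             body: "The build finished with a status: ${STATUS}"
    }
}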
Really easy solution, maybe not too elegant, but it works!
1: Catch every build result you want to handle (in this case SUCCESS).
2: Inject an env variable whose value is the job status.
3: Do the same for any other status (in this case I catch everything from aborted to unstable).
4: Afterwards you'll be able to use the value for whatever you want to do; in this case I'm passing it to an Ant script (or you can load it directly from Ant as an environment variable).
Hope it can help!
Groovy script solution:
Here I am using the Groovy script plugin to take the build status and set it in an environment variable, so the variable can be used in post-build scripts via the Post-build Task plugin.
Groovy script:
import hudson.EnvVars
import hudson.model.Environment

// the currently running build and its result
def build = Thread.currentThread().executable
def result = manager.build.result.toString()
// expose the result to later steps as the BUILD_STATUS environment variable
def vars = [BUILD_STATUS: result]
build.environments.add(0, Environment.create(new EnvVars(vars)))
Post-build script:
echo BUILD_STATUS="${BUILD_STATUS}"
Try the Post Build Task plugin...
It lets you specify conditions based on the log output...
Basic solution (please don't laugh)
#!/bin/bash
STATUS='Not set'
if [ ! -z $UPSTREAM_BUILD_DIR ]; then
    ISFAIL=$(ls -l /var/lib/jenkins/jobs/$UPSTREAM_BUILD_DIR/builds | grep "lastFailedBuild\|lastUnsuccessfulBuild" | grep $UPSTREAM_BUILD_NR)
    ISSUCCESS=$(ls -l /var/lib/jenkins/jobs/$UPSTREAM_BUILD_DIR/builds | grep "lastSuccessfulBuild\|lastStableBuild" | grep $UPSTREAM_BUILD_NR)
    if [ ! -z "$ISFAIL" ]; then
        echo $ISFAIL
        STATUS='FAIL'
    elif [ ! -z "$ISSUCCESS" ]; then
        STATUS='SUCCESS'
    fi
fi
echo $STATUS
where
$UPSTREAM_BUILD_DIR=$JOB_NAME
$UPSTREAM_BUILD_NR=$BUILD_NUMBER
are passed from the upstream build.
Of course "/var/lib/jenkins/jobs/" depends on your Jenkins installation.
