Question: Is there a command in FAKE available which prints all the defined targets in the build script?
I want to set up my FAKE build in such a way that it prints a list of all available targets in the build script when I do not specify a target.
For example:
> build.cmd
Available targets:
- Clean
  Depends on: []
- DeleteBinObj
  Depends on: []
- RestorePackages
  Depends on: ["Clean"]
- Build
  Depends on: ["RestorePackages"]
- CopyBinaries
  Depends on: ["RunTests"]
- RunTests
  Depends on: ["Build"]
- Default
  Depends on: ["CopyBinaries"]
In the FAKE build script I would define something like:
Target "Default" (fun _ ->
listTargets
)
RunTargetOrDefault "Default"
The only thing that is missing is the command listTargets.
In your build.cmd replace
packages\FAKE\tools\FAKE.exe build.fsx %*
with
if [%1] == [] (
    packages\FAKE\tools\FAKE.exe build.fsx --listTargets
) else (
    packages\FAKE\tools\FAKE.exe build.fsx %*
)
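If the repository also ships a UNIX build.sh wrapper (an assumption; only build.cmd is shown above), the same fallback can be sketched in bash, running FAKE.exe through mono:

#!/usr/bin/env bash
# Hypothetical build.sh counterpart to the build.cmd change above:
# with no arguments, list the available targets instead of running the default one.
if [ $# -eq 0 ]; then
    mono packages/FAKE/tools/FAKE.exe build.fsx --listTargets
else
    mono packages/FAKE/tools/FAKE.exe build.fsx "$@"
fi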
I have a Bazel executable target (of type fsharp_binary, but I don't think it should matter) that I can run using bazel run.
bazel run //my_app.exe
I would like to use this executable as a test, so that when I call bazel test it gets built and executed, and a non-zero exit code is considered a test failure.
bazel test //...
What I am looking for is something like this:
test_of_executable(
    name = "my_test",
    executable = "//my_app.exe",
    success_codes = [0],
)
Then:
bazel test //:my_test
How can I achieve this in Bazel?
Just wrap your app as a sh_test. See for example https://github.com/bazelbuild/bazel/issues/1969.
What I use in my codebase is:
BUILD.bazel:
sh_test(
    name = "test",
    srcs = ["test.sh"],
    data = [
        "//:some_binary",
    ],
)
test.sh:
some_project/some_subdir/some_binary
See here for a real example.
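If you want something closer to the success_codes idea from the question, the wrapper script can inspect the exit code itself. A minimal sketch (the path matches the example above; which codes count as success is up to you):

#!/usr/bin/env bash
# Run the binary via its runfiles-relative path and map its exit code to pass/fail.
some_project/some_subdir/some_binary
code=$?
# Treat only exit code 0 as success; list additional acceptable codes here if needed.
if [ "$code" -ne 0 ]; then
    echo "binary exited with code $code" >&2
    exit 1
fi
exit 0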
We want to define the version of our Gradle project during the build in our Jenkins pipeline; the version should include a timestamp and a git commit id (e.g. 20180625180158-b8ad8df0dc0356a91707eaa241de7d62df6a29f2).
void defineVersion() {
    sh "git rev-parse HEAD > .git/commit-id"
    commitId = readFile('.git/commit-id')
    timestamp = getCurrentTimestamp()
    version = timestamp + '-' + commitId
}
This function will determine the version I want to publish our artifact with.
Next I use the Artifactory Gradle plugin to publish, but I can't find a way to set/override the project version. I want the jar to be published with version 20180625180158-b8ad8df0dc0356a91707eaa241de7d62df6a29f2
version = defineVersion() // how can we incorporate this version in our Gradle build/publish?
gradleBuild = Artifactory.newGradleBuild()
gradleBuild.useWrapper = true
gradleBuild.deployer(
    repo: env.BRANCH_NAME == 'master' ? 'libs-releases-local' : 'libs-snapshots-local',
    server: Artifactory.server('artifactory-global'))
gradleBuild.run tasks: 'clean build artifactoryPublish'
How can we achieve this? Also I would like to pass other parameters like -x test to the run command to skip tests in this stage.
Apparently you can add parameters through the switches parameter: https://jenkins.io/doc/pipeline/steps/artifactory/
With this you can add the necessary parameters, e.g. '-x test -Pversion=' + version
For my use case I added a version property to my build.gradle (version = "${version}") so it can be overridden with the command above.
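Put together, the pipeline ends up running something roughly equivalent to the following command line (illustrative only; the actual invocation is assembled by the Artifactory Gradle plugin from the tasks and switches you pass):

# Skip tests and override the version property declared in build.gradle:
./gradlew clean build artifactoryPublish -x test -Pversion=20180625180158-b8ad8df0dc0356a91707eaa241de7d62df6a29f2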
How do I execute another FAKE build script from within a FAKE build script?
I want to create a build script which clones a git repo and then executes a build script within the cloned project. After that it should tag the branch and publish the build artifact to a Nexus Repository Manager.
Also I want to hand over some parameters to the external build script, e.g. a version tag.
Here is some pseudo code:
Target "ReleaseBuild" (fun _ ->
CleanDir "src"
cloneSingleBranch "" "http://<URL to git-repo>" "master" "src"
????? "src/build.fsx" versionNumer
... tagging etc. ...
)
The usual way to modularize your build is not by calling FAKE within FAKE, but by having FSI load modules using the #load directive.
For example:
// build.fsx
#r "path/to/FakeLib.dll"
#load "./OtherModule.fsx"
open Fake
Target "ReleaseBuild" <| fun _ ->
CleanDir "src"
cloneSingleBranch "" "http://<URL to git-repo>" "master" "src"
OtherModule.DoWhatever versionNumer
...
// OtherModule.fsx
let DoWhatever version =
...
EDIT (in response to comment)
If the other script file doesn't exist when the first one starts, then you could execute it using FSIHelper.executeFSI:
Target "ReleaseBuild" <| fun _ ->
CleanDir "src"
cloneSingleBranch "" "http://<URL to git-repo>" "master" "src"
let result, messages = FSIHelper.executeFSI "src" "build.fsx" []
...
(or executeFSIWithArgs if you need to pass arguments)
If even this doesn't work for you for some reason, I would just resort to executing it as a regular shell command, using Shell.Exec:
Target "ReleaseBuild" <| fun _ ->
CleanDir "src"
cloneSingleBranch "" "http://<URL to git-repo>" "master" "src"
let errCode = Shell.Exec ("path/to/fake.exe", args = "src/build.fsx")
...
But if you want this to be cross-platform, it gets a bit complicated: *.exe won't execute natively on UNIX; you need to run mono and pass the .exe file as an argument. To work around that, you'd need to wrap fake.exe in a shell script (two scripts, actually: one for Windows, one for UNIX) and then pass the name of that script as an argument into the FAKE script.
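The UNIX wrapper can be as small as this (a sketch; the FAKE.exe path assumes the usual packages/FAKE/tools layout, and the script name fake.sh is made up):

#!/usr/bin/env bash
# fake.sh - lets the outer build call "fake.sh src/build.fsx <args>" without caring about mono.
exec mono packages/FAKE/tools/FAKE.exe "$@"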
I want to switch from MSBuild to FAKE. In my MSBuild script I create a Webdeploy package by invoking MSBuild with the properties DeployOnBuild=True and DeployTarget=Package. This will trigger webdeploy to generate a deployment package while the build is running:
<MSBuild Projects="@(ItemToBuild)"
         Targets="Build"
         Properties="Configuration=$(Configuration);
                     Platform=$(Platform);
                     DeployOnBuild=True;
                     DeployTarget=Package;
                     OutFolder=$(OutFolder)" />
How can I do the same thing with FAKE? I've come this far:
Target "Build" (fun _ ->
!! solutionFile
|> MSBuildRelease binDir "Build"
|> Log "Build-Output: "
)
How can I specify the required properties?
If you look at the source code, you'll see that MSBuildRelease is just a shortcut for MSBuild proper with certain predefined properties. If you need to define other properties, besides "Configuration", you can just fall back to MSBuild:
Target "Build" (fun _ ->
!! solutionFile
|> MSBuild binDir "Build"
[
"Configuration", "Release"
"Platform", "AnyCPU"
"DeployOnBuild", "True"
"DeployTarget", "Package"
"OutFolder", "/what/ever"
]
|> Log "Build-Output: "
)
I would like to have a post-build hook or similar, so that I can get the same output as e.g. the IRC plugin, but hand it to a script.
I was able to get all the info except for the actual build status. That just doesn't work, whether as a "Post-build script", a "Post-build task", a "Parameterized Trigger", and so on.
It is possible with some very ugly workarounds, but I wanted to ask, in case someone has a nicer option ... short of writing my own plugin.
It works, as mentioned, with the Groovy Post-Build plugin, but without any extra quoting within the string that gets executed. So I had to put the actual functionality into a shell script that calls curl, which in turn needs quoting for the POST parameters and so on.
def result = manager.build.result
def build_number = manager.build.number
def env = manager.build.getEnvironment(manager.listener)
def build_url = env['BUILD_URL']
def build_branch = env['SVN_BRANCH']
def short_branch = ( build_branch =~ /branches\//).replaceFirst("")
def host = env['NODE_NAME']
def svn_rev = env['SVN_REVISION']
def job_name = manager.build.project.getName()
"/usr/local/bin/skypeStagingNotify.sh Deployed ${short_branch} on ${host} - ${result} - ${build_url}".execute()
Use a Groovy script in a post-build step via the Groovy Post-Build plugin. You can then access Jenkins internals via the Jenkins Java API. The plugin provides the script with the variable manager, which can be used to access important parts of the API (see the Usage section in the plugin documentation).
For example, here's how you can execute a simple external Python script on Windows and output its result (as well as the build result) to build console:
def command = """cmd /c python -c "for i in range(1,5): print i" """
manager.listener.logger.println command.execute().text
def result = manager.build.result
manager.listener.logger.println "And the result is: ${result}"
For this I really like the Conditional Build Step plugin. It's very flexible, and you can choose which actions to take based on build failure or success. For instance, I use a conditional build step to send a notification on build failure.
You can also use a conditional build step to set an environment variable or write to a log file that you then use in subsequent "execute shell" steps. For instance, you might create a build with three steps: one that compiles the code and runs the tests, another that sets a STATUS="failed" environment variable, and a third that sends an email like "The build finished with a status: ${STATUS}" (see the sketch below).
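A rough sketch of that three-step idea with plain shell steps (since environment variables do not persist between separate "execute shell" steps, this version passes the status through a file; the file name, mail command and address are illustrative):

# Step 2 - runs only when the previous step failed (guarded by the conditional build step):
echo "failed" > build_status.txt

# Step 3 - always runs: read the status back and send the notification.
STATUS=$(cat build_status.txt 2>/dev/null || echo "success")
echo "The build finished with a status: ${STATUS}" | mail -s "Build status" team@example.com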
Really easy solution, maybe not too elegant, but it works!
1: Catch every build result you want to handle (in this case SUCCESS).
2: Inject an environment variable valued with the job status.
3: Do the same for any other kind of status (in this case I catch everything from aborted to unstable).
4: Afterwards you'll be able to use the value for whatever you want to do; in this case I'm passing it to an Ant script. (Or you can load it directly from Ant as an environment variable.)
Hope it can help!
Groovy script solution:
Here I am using the Groovy script plugin to take the build status and set it in an environment variable, so the environment variable can be used in post-build scripts via the Post Build Task plugin.
Groovy script:
import hudson.EnvVars
import hudson.model.Environment
def build = Thread.currentThread().executable
def result = manager.build.result.toString()
def vars = [BUILD_STATUS: result]
build.environments.add(0, Environment.create(new EnvVars(vars)))
Post-build task script:
echo BUILD_STATUS="${BUILD_STATUS}"
Try Post Build Task plugin...
It lets you specify conditions based on the log output...
Basic solution (please don't laugh)
#!/bin/bash
STATUS='Not set'
if [ ! -z "$UPSTREAM_BUILD_DIR" ]; then
    ISFAIL=$(ls -l "/var/lib/jenkins/jobs/$UPSTREAM_BUILD_DIR/builds" | grep "lastFailedBuild\|lastUnsuccessfulBuild" | grep "$UPSTREAM_BUILD_NR")
    ISSUCCESS=$(ls -l "/var/lib/jenkins/jobs/$UPSTREAM_BUILD_DIR/builds" | grep "lastSuccessfulBuild\|lastStableBuild" | grep "$UPSTREAM_BUILD_NR")
    if [ ! -z "$ISFAIL" ]; then
        echo "$ISFAIL"
        STATUS='FAIL'
    elif [ ! -z "$ISSUCCESS" ]; then
        STATUS='SUCCESS'
    fi
fi
echo "$STATUS"
where
$UPSTREAM_BUILD_DIR = $JOB_NAME
$UPSTREAM_BUILD_NR = $BUILD_NUMBER
are passed from the upstream build.
Of course "/var/lib/jenkins/jobs/" depends on your Jenkins installation.