When running my Jenkins build I need to update the contents of a file, in this case with a version number. I have come across a plugin called text-file-operations, but rather than writing a whole new file I thought it would be better to update the existing one.
In this example I have a podspec file located in the root of the project which just needs a version number updated with a variable I have created earlier in the process.
spec.version = '13.4.0'
I just need to convert that to
spec.version = "${VERSION_NUMBER}"
Is there a way to do this?
Is this what you want, then? Groovy + how to append text to a file (new line):
def f = new File('<filename>')
// appends a new spec.version line (quoted so the podspec stays valid Ruby)
f.append("spec.version = '${VERSION_NUMBER}'\n")
I've got a Java project I'm converting to Bazel.
As is typical with Java projects, there are property files with placeholders that need to be resolved/substituted at build time.
Some of the values can be hardcoded in a BUILD or BZL file:
BUILD_PROPERTIES = { "pom.version": "1.0.0", "pom.group.id": "com.mygroup"}
Some of the variables are "stamps" (e.g. BUILD_TIMESTAMP, GIT_REVISION, etc.); the source for these variables is volatile-status.txt and stable-status.txt.
I must generate a POM for publishing, so I use @bazel_common//tools/maven:pom_file in BUILD
(assume that I need ALL the values described above for my pom template):
_local_build_properties = {}
_local_build_properties.update(BUILD_PROPERTIES)
# somehow add workspace status properties?
# add / override
_local_build_properties.update({
    "pom.project.name": "my-submodule",
    "pom.project.description": "My submodule description",
    "pom.artifact.id": "my-submodule",
})

# Variable placeholders in the pom template are wrapped with {}
_pom_substitutions = {"{" + k + "}": v for (k, v) in _local_build_properties.items()}

pom_file(
    name = "my_submodule_pom",
    targets = [
        "//my-submodule",
    ],
    template_file = "//:pom_template.xml",
    substitutions = _pom_substitutions,
)
So, my questions are:
1. How do I get key-value pairs from volatile/stable-status.txt into the dictionary I need for pom_file.substitutions?
2. pom_file depends on java_library so that it can write its dependencies into the POM. How do I update the jar generated by java_library with the pom?
3. Once I have the pom and the updated jar containing the pom, how do I publish to a Maven repo?
When I look at existing code, for example rules_docker, it seems that the implementation always bails out to a local executable (shell | python | go) to do the real work of substitution, jar manipulation and image publication. Am I trying to do too much in BUILD and BZL files? Should I be thinking, "Ultimately, what do I need to pass to local shell/python/go scripts to get the real build work done?"
(Answered on bazel-discuss group)
Hi,
1. You can't get these values from Starlark. You need a genrule to read the stable/volatile files and do the substitutions using an external tool like 'sed'.
2. A file cannot be both an input and an output of an action, i.e. you can't update the .jar from which you generate the pom. The action has to produce a new .jar file.
3. I don't know -- how would you publish outside of Bazel, is there a tool to do so? Can you write a genrule / Starlark rule to wrap this tool?
Cheers, László
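To make the first point concrete, here is a minimal sketch of such a rule; the rule name expand_workspace_status, its attributes and the awk/sed pipeline are my own illustration, not part of the quoted answer. Starlark never reads the key-value pairs itself; it only wires stable-status.txt and volatile-status.txt (ctx.info_file and ctx.version_file) into an action that performs the substitution:

# expand_status.bzl -- a sketch; all names are illustrative
def _expand_workspace_status_impl(ctx):
    out = ctx.actions.declare_file(ctx.attr.out)
    ctx.actions.run_shell(
        # ctx.info_file is stable-status.txt, ctx.version_file is volatile-status.txt
        inputs = [ctx.file.template, ctx.info_file, ctx.version_file],
        outputs = [out],
        arguments = [
            ctx.info_file.path,
            ctx.version_file.path,
            ctx.file.template.path,
            out.path,
        ],
        # Build one 'sed -e' expression per "KEY value" status line, replacing
        # {KEY} with the value; assumes values contain no spaces or '|'.
        command = """
            exprs=$(awk '{ key = $1; $1 = ""; sub(/^ /, "");
                           printf " -e s|{%s}|%s|g", key, $0 }' "$1" "$2")
            sed $exprs "$3" > "$4"
        """,
    )
    return [DefaultInfo(files = depset([out]))]

expand_workspace_status = rule(
    implementation = _expand_workspace_status_impl,
    attrs = {
        "template": attr.label(allow_single_file = True, mandatory = True),
        "out": attr.string(mandatory = True),
    },
)

In a BUILD file this could post-process the output of pom_file, e.g. expand_workspace_status(name = "stamped_pom", template = ":my_submodule_pom", out = "pom.xml"), since pom_file.substitutions has to be a literal string dict known before the build runs and therefore cannot carry the stamp values.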
Starting with Bazel v0.19, if you have Starlark (formerly known as "Skylark") code that references @bazel_tools//tools/jdk:jar, you see messages like this at build time:
WARNING: <trimmed-path>/external/bazel_tools/tools/jdk/BUILD:79:1: in alias rule @bazel_tools//tools/jdk:jar: target '@bazel_tools//tools/jdk:jar' depends on deprecated target '@local_jdk//:jar': Don't depend on targets in the JDK workspace; use @bazel_tools//tools/jdk:current_java_runtime instead (see https://github.com/bazelbuild/bazel/issues/5594)
I think I could make things work with @bazel_tools//tools/jdk:current_java_runtime if I wanted access to the java command, but I'm not sure what I'd need to do to get the jar tool to work. The contents of the linked GitHub issue didn't seem to address this particular problem.
I stumbled across a commit to Bazel that makes a similar adjustment to the Starlark java rules. It uses the following pattern: (I've edited the code somewhat)
# in the rule attrs:
"_jdk": attr.label(
    default = Label("//tools/jdk:current_java_runtime"),
    providers = [java_common.JavaRuntimeInfo],
),

# then in the rule implementation:
java_runtime = ctx.attr._jdk[java_common.JavaRuntimeInfo]
jar_path = "%s/bin/jar" % java_runtime.java_home
ctx.actions.run_shell(
    # other_inputs, input_files and deploy_jar are defined elsewhere in the rule
    inputs = ctx.files._jdk + other_inputs,
    outputs = [deploy_jar],
    command = "%s cmf %s" % (jar_path, input_files),
)
Additionally, java is available at str(java_runtime.java_executable_exec_path) and javac at "%s/bin/javac" % java_runtime.java_home.
See also a pull request with a simpler example.
Because my reference to the jar tool is inside a genrule within a top-level macro, rather than a rule, I was unable to use the approach from Rodrigo's answer. I instead explicitly referenced the current_java_runtime toolchain and was then able to use the JAVABASE make variable as the base path for the jar tool.
native.genrule(
    name = genjar_rule,
    srcs = [<rules that create files being jar'd>],
    cmd = "some_script.sh $(JAVABASE)/bin/jar $@ $(SRCS)",
    tools = ["some_script.sh", "@bazel_tools//tools/jdk:current_java_runtime"],
    toolchains = ["@bazel_tools//tools/jdk:current_java_runtime"],
    outs = [<some outputs>],
)
I made a repository for glfw with this:
load("#bazel_tools//tools/build_defs/repo:git.bzl", "new_git_repository")
new_git_repository(
name = "glfw",
build_file = "BUILD.glfw",
remote = "https://github.com/glfw/glfw.git",
tag = "3.2.1",
)
I put BUILD.glfw in the WORKSPACE root. When I built, I saw:
no such package '@glfw//': Not a regular file: [snipped]/external/BUILD.glfw
I moved BUILD.glfw to external/BUILD.glfw and it seems to work, but I couldn't find documentation about this. The docs about new_git_repository say that build_file "...is a label relative to the main workspace."; I don't see anything about 'external' there.
This is due to a semantic inconsistency between the native and the (newer) Starlark versions of new_git_repository. To use the native new_git_repository, comment out or remove the load statement:
# load("@bazel_tools//tools/build_defs/repo:git.bzl", "new_git_repository")
Assuming that new_git_repository has the same problem that http_archive has, per Bazel issue 6225 you need to refer to the BUILD file for glfw as @//:BUILD.glfw
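As a sketch of that second suggestion (assuming new_git_repository really does need the same @//: workaround as http_archive, and that BUILD.glfw stays in the workspace root), the WORKSPACE entry would look roughly like this:

load("@bazel_tools//tools/build_defs/repo:git.bzl", "new_git_repository")

new_git_repository(
    name = "glfw",
    # the explicit @//: prefix points at BUILD.glfw in the main workspace root
    build_file = "@//:BUILD.glfw",
    remote = "https://github.com/glfw/glfw.git",
    tag = "3.2.1",
)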
I have a file "settings.ini" which needs to reside next to the Qt executable.
I can add a custom build step for this in Qt Creator which calls something like this:
copy %{sourceDir}/settings.ini %{buildDir}/settings.ini
This works great so far, but I'd like to include this in the *.pro file so I can put this up in our SVN too.
How can I do this using qmake/.pro-files only?
To copy %{sourceDir}/settings.ini to the build directory without requiring a call to make install, use:
copydata.commands = $(COPY_DIR) $$PWD/settings.ini $$OUT_PWD
first.depends = $(first) copydata
export(first.depends)
export(copydata.commands)
QMAKE_EXTRA_TARGETS += first copydata
$$PWD is the path of the current .pro file. If your settings.ini file is not located in the same directory as the project file, use something like $$PWD/more_dirs_here/settings.ini.
Note: I found this solution here. I recommend reading the whole article, as it explains how it works.
You probably want to use the INSTALLS keyword in QMake. It will require you to run make install after your build, but it does work cross-platform.
# %{buildDir} and %{sourceDir} are Qt Creator variables; in a .pro file use the qmake equivalents
install_it.path = $$OUT_PWD
install_it.files += $$PWD/settings.ini
INSTALLS += install_it
For macOS bundles you can handle it this way;
see Resource files in OS X bundle.
Add this to your project file:
APP_QML_FILES.files = path/to/file1.qml path/to/file2.qml
APP_QML_FILES.path = Contents/Resources
QMAKE_BUNDLE_DATA += APP_QML_FILES
This example copies the files to Contents/Resources.
Compatible with Windows and macOS development environments:
Replace {AppName} with the respective application name.
# Define mac/windows specific target dirs
macx {
    TARGETDIR = $$OUT_PWD/{AppName}.app/Contents/MacOS/
} else {
    TARGETDIR = $$OUT_PWD
}

# The target directory does not exist on the first build; without mkdata
# the build only succeeds after several attempts, so create it explicitly
mkdata.commands = $(MKDIR) $${TARGETDIR}
copydata.commands = $(COPY_FILE) $$PWD/settings.ini $${TARGETDIR}
first.depends = $(first) mkdata copydata
export(first.depends)
export(mkdata.commands)
export(copydata.commands)
QMAKE_EXTRA_TARGETS += first mkdata copydata
Happy to add Unix support if someone posts a Unix solution in the comments.
I am using Jenkins (Hudson) with the Grails plugin to do builds as I update svn. I found this example script that allows you to incorporate a build number from an env var:
set-version 1.1.0.${env['BUILD_NUMBER']}
But as you see, the prefix is hard-coded. I'd like to use the version number set in the application.properties file. How can I do something like:
set-version ${app.version}.${env['BUILD_NUMBER']}
I have tried a variety of scopes and syntaxes, to no avail.
It is not possible out of the box. Jenkins, or more specifically the Grails plugin, does not read the content of the application.properties file, so the existing application version is not available as a variable.
You might want to consider writing a custom script in your application (like append-version) that reads application.properties and appends the value passed in. You can call the existing set-version script, modifying the argument.
I have also created a simple script that should do the job:
includeTargets << grailsScript("Init")

target(main: "Append a string to the existing version number") {
    depends(checkVersion, parseArguments)

    // read the current app.version, append the argument passed on the
    // command line, and write it back to application.properties
    def newVersion = metadata.'app.version' + '-' + args
    metadata.'app.version' = newVersion
    metadata.persist()
}
setDefaultTarget(main)