I have a Java project that has multiple subprojects. It currently uses Gradle; however, we are now trying to move to Bazel.
How can I create a WAR file using Bazel build?
Could not find any example online.
The only thing I found is this repo:
https://github.com/bmuschko/rules_java_war
However it hasn't had any activity in over 3 years. I am not sure if it is still valid.
Bazel does not ship a native WAR rule, so to build a WAR (Web Application Archive) you need a ruleset that provides one (such as the rules_java_war repo you found, or a small rule of your own). Assuming such a rule is available, the steps in your BUILD file look like this:
Define a Java library target: If your WAR project contains Java code, you will need to define a Java library target in your BUILD file. This target specifies the location of your Java code and its dependencies.
java_library(
    name = "my_java_library",
    srcs = glob(["src/main/java/**/*.java"]),
    deps = ["//third_party/library:library"],
)
Define a filegroup target: If your WAR project contains any web application resources (such as HTML, JavaScript, and CSS files), you will need to define a filegroup target in your BUILD file. This target specifies the location of your web application resources.
filegroup(
    name = "my_web_resources",
    srcs = glob(["src/main/webapp/**/*"]),
)
Define a war target: Finally, define the target that assembles the WAR from your Java library and web application resources. The rule and attribute names below (war, libs, resources, webxml) are illustrative; the actual names depend on the ruleset you use.
war(
    name = "my_war_file",
    libs = [":my_java_library"],
    resources = [":my_web_resources"],
    webxml = "src/main/webapp/WEB-INF/web.xml",
)
These are the basic steps for creating a WAR in Bazel. Because there is no first-party WAR rule, check the documentation of whichever ruleset you adopt for the exact rule and attribute names; the details will also depend on the specific architecture and technology stack of your project.
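Alternatively, since a WAR is just a zip archive with a WEB-INF layout, you can sidestep third-party rulesets entirely and assemble one with a genrule. This is a rough sketch, not a drop-in solution: the target names match the examples above, it assumes a zip binary is available on the host, and the path-stripping logic assumes all resources live under src/main/webapp:

```
genrule(
    name = "my_war_genrule",
    srcs = [
        ":my_java_library",   # the compiled class jar(s)
        ":my_web_resources",  # static files, including WEB-INF/web.xml
    ],
    outs = ["my_app.war"],
    cmd = """
        out=$$PWD/$@
        mkdir -p war/WEB-INF/lib
        # Library jars go into WEB-INF/lib
        for j in $(locations :my_java_library); do
            cp $$j war/WEB-INF/lib/
        done
        # Web resources keep their path relative to src/main/webapp
        for f in $(locations :my_web_resources); do
            dest=war/$${f#*src/main/webapp/}
            mkdir -p $$(dirname $$dest)
            cp $$f $$dest
        done
        (cd war && zip -qr $$out .)
    """,
)
```

Transitive dependency jars are not collected here; for a real deployment you would feed the rule a deploy jar or enumerate the runtime classpath explicitly.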
Related
I'm currently importing grpc as an external http_archive in a Bazel C++ project. I would like to build with the flag --config=dbg, as specified in the project's bazel.rc file, here, but just for this dependency. Is there any way for me to do this without downloading the repository and editing the internal bazel build files?
Configurations in bazel.rc files in external dependencies are not automatically applied, see:
How does tools/bazel.rc work with external Workspace dependencies?
As mentioned there, you can copy those configurations into your project's tools/bazel.rc file.
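For example, grpc's bazel.rc defines a dbg config; copying it into your own tools/bazel.rc (or .bazelrc in newer Bazel versions) would look roughly like this. The exact flags should be taken from the dependency's file; the line below is illustrative:

```
# tools/bazel.rc -- copied from the dependency's bazel.rc
build:dbg --compilation_mode=dbg
```

Then bazel build --config=dbg //... applies the flags. Note that this applies them to the entire build, not just the one external dependency.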
How do you enumerate and copy multiple files to the source folder in Bazel?
I'm new to Bazel and I am trying to replace a non-Bazel build step that is effectively cp -R with an idiomatic Bazel solution. Concrete use cases are:
copying .proto files to a sub-project where they will be picked up by a non-Bazel build system. There are N .proto files in N Bazel packages, all in one protos/ directory of the repository.
copying numerous .gotmpl template files to a different folder where they can be picked up in a docker volume for a local docker-compose development environment. There are M template files in one Bazel package in a small folder hierarchy. Example code below.
Copy those same .gotmpl files to a gitops-type repo for a remote terraform to send to prod.
All sources are regular, checked in files in places where Bazel can enumerate them. All target directories are also Bazel packages. I want to write to the source folder, not just to bazel-bin, so other non-Bazel tools can see the output files.
Currently, when adding a template file or a proto package, a script must be run outside of Bazel to pick up that new file and add it to a generated .bzl file, or the operations must be performed completely outside of Bazel. I would like to eliminate this step to move closer to having one true build command.
I could accomplish this with symlinks but it still has an error-prone manual step for the .proto files and it would be nice to gain the option to manipulate the files programmatically in Bazel in the build.
Some solutions I've looked into and hit dead ends:
glob seems to be relative to current package and I don't see how it can be exported since it needs to be called from BUILD. A filegroup solves the export issue but doesn't seem to allow enumeration of the underlying files in a way that a bazel run target can take as input.
Rules like cc_library that happily accept globs as srcs are built into the Bazel source code, not written in Starlark.
genquery and aspects seem to have powerful meta-capabilities but I can't see how to actually accomplish this task with them.
The "bazel can write to the source folder" pattern and write_source_files from aspect-build/bazel-lib might be great if I could programmatically generate the files parameter.
Here is the template example which is the simpler case. This was my latest experiment to bazel-ify cp -R. I want to express src/templates/update_templates_bzl.py in Bazel.
src/templates/BUILD:
# [...]
exports_files(glob(["**/*.gotmpl"]))
# [...]
src/templates/update_templates_bzl.py:
#!/usr/bin/env python
from pathlib import Path
parent = Path(__file__).parent
template_files = [str(f.relative_to(parent)) for f in list(parent.glob('**/*.gotmpl'))]
as_python = repr(template_files).replace(",", ",\n ")
target_bzl = Path(__file__).parent / "templates.bzl"
target_bzl.write_text(f""""Generated template list from {Path(__file__).relative_to(parent)}"
TEMPLATES = {as_python}""")
src/templates/copy_templates.bzl:
"""Utility for working with this list of template files"""

load("@aspect_bazel_lib//lib:write_source_files.bzl", "write_source_files")
load(":templates.bzl", "TEMPLATES")

def copy_templates(name, prefix):
    files = {
        "%s/%s" % (prefix, f): "//src/templates:%s" % f
        for f in TEMPLATES
    }
    write_source_files(
        name = name,
        files = files,
        visibility = ["//visibility:public"],
    )
other/module:
load("//src/templates:copy_templates.bzl", "copy_templates")

copy_templates(
    name = "write_template_files",
    prefix = "path/to/gitops/repo/templates",
)
One possible method is to use google/bazel_rules_install.
As mentioned in the project README.md, you need to add the following to your WORKSPACE file:
# file: WORKSPACE
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "com_github_google_rules_install",
    urls = ["https://github.com/google/bazel_rules_install/releases/download/0.3/bazel_rules_install-0.3.tar.gz"],
    sha256 = "ea2a9f94fed090859589ac851af3a1c6034c5f333804f044f8f094257c33bdb3",
    strip_prefix = "bazel_rules_install-0.3",
)

load("@com_github_google_rules_install//:deps.bzl", "install_rules_dependencies")

install_rules_dependencies()

load("@com_github_google_rules_install//:setup.bzl", "install_rules_setup")

install_rules_setup()
Then in your src/templates directory you can add the following to bundle all your templates into one target.
# file: src/templates/BUILD.bazel
load("@com_github_google_rules_install//installer:def.bzl", "installer")

installer(
    name = "install_templates",
    data = glob(["**/*.gotmpl"]),
)
Then you can use the installer to install into your chosen directory like so.
bazel run //src/templates:install_templates -- path/to/gitops/repo/templates
It's also worth checking out bazelbuild/rules_docker for building your development environments using only Bazel.
I have a main project that implements a Bazel rule and a subproject that simulates the end-user experience. In the subproject I'd like to load an archive that is created in the main project, as if it were loaded using http_archive. Here's an example of the repo setup:
root/
|- WORKSPACE
|- BUILD
|- rules.bzl
\- integration-tests/
|- WORKSPACE
|- BUILD
The root/BUILD file has a :release target which creates a tar.gz file. I would like to load this file inside integration-tests/WORKSPACE as if it were loaded using http_archive. Is there a way to do this?
Simplest way I've found is to use a file:// URL (note that http_archive requires a name, and file:// URLs need an absolute path):

http_archive(
    name = "my_archive",  # hypothetical repository name
    urls = [
        "file://path_to_archive",
    ],
)
Is there a way to do this?
Strictly speaking, no, there is not a way to do this. Repository rules like http_archive are executed during the loading phase of a build, while build outputs are created during the execution phase. A repository rule cannot depend on a build target, since that target won't have been built yet.
This is true even and especially across workspace boundaries. There is no way for your sub-project's WORKSPACE to directly depend on a build target from the parent project, or any other project.
In this case, I'd think about whether you actually need to load the release tarball in the WORKSPACE. Is the tarball actually a Bazel repository? If so, you might want to look into techniques for testing Bazel extensions.
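For example, instead of consuming the release tarball, the integration-tests workspace can point straight at the parent checkout with local_repository; this is a common pattern for testing a ruleset against itself. The repository name here is illustrative:

```
# integration-tests/WORKSPACE
local_repository(
    name = "my_rules",  # hypothetical name for the parent project's workspace
    path = "..",        # the root workspace that defines rules.bzl
)
```

Targets under integration-tests can then use load("@my_rules//:rules.bzl", ...) directly, and the :release packaging step can be verified separately.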
You can override dependencies on the command line, and use a local checkout of a repository instead of the http_archive.
For example, given the following WORKSPACE.bazel
workspace(name = "cli_cpp")

load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

# Expose the entire external repo as a single filegroup
all_content = """filegroup(name = "all", srcs = glob(["**/*"]), visibility = ["//visibility:public"])"""

http_archive(
    name = "qpid-cpp",
    build_file_content = all_content,
    strip_prefix = "qpid-cpp-main",
    url = "https://github.com/apache/qpid-cpp/archive/main.zip",
)
You can check out the qpid-cpp repository (or download and unzip that archive), make your changes in it, add an empty WORKSPACE.bazel file, and also add the following BUILD.bazel there:
filegroup(
    name = "all",
    srcs = glob(["**/*"]),
    visibility = ["//visibility:public"],
)
Now, run Bazel with the override flag pointing at your local checkout, for example:
bazel build //... --override_repository=qpid-cpp=/path/to/qpid-cpp
How do I reference a dependency by convention within my project build path? Allow me to elaborate...
I'm using the Groovy/Grails Tool Suite (GGTS 3.0). I'm referencing the dependency as such in my BuildConfig.groovy:
dependencies {
    compile 'tgt:jaxb-utils:1.0'
}
The referenced jar file is successfully pulled down from the Artifactory repo - I can find the jar file on my local file system in the Ivy cache. When I run any grails targets (grails compile, grails run-app, grails run-tests), it works fine - meaning my code that imports the classes from the jar has no problems. When I run grails war, the referenced jar file is packed into the constructed war.
However, and here is the crux of this post: the project build path does not reference this jar file by default or convention, so my Java or Groovy code that imports the classes from the jar file reports an unable to resolve class ... error.
One solution is to simply add the jar file to the lib folder and update the build path accordingly, or modify the build path to reference the jar file in the Ivy cache folder. However, I'd have to do this for any/all dependencies pulled down in this way. Any jars in the lib folder will get saved to the team source code repository, and that seems like wasted space since the grails commands work fine with just the dependency reference in BuildConfig.groovy.
Am I just being too idealistic (i.e., difficult) here, or is there a better way to clean up the unable to resolve class ... errors in my IDE without having to manually add the dependent jar files to my lib folder and update my build path? Ideas?
Eclipse / STS / GGTS: If you have the Grails plugin installed, you can do the following:
Right-click on your project -> Grails Tools -> Refresh Dependencies (or the shortcut Alt+G, R)
I have a Java project that is dependent on other Java projects that are siblings and there is a chain of dependencies. Each individual project has a build script written in Ant. For clarity find below a sample of the same.
EARProject depends on WebProject and EJBProject: The war file that is generated by the WebProject build and jar file that is generated by the EJBProject are needed to build the EARProject.
WebProject depends on ComponentOneProject: The jar file that is generated by the ComponentOneProject build is needed to build WebProject.
EJBProject depends on ComponentTwoProject: The jar file that is generated by the ComponentTwoProject build is needed to build EJBProject.
So, when I build the EARProject, if the dependent war and jar have not been built yet, it should kick off the WebProject and EJBProject builds; and if ComponentOneProject is yet to be built, its build needs to be kicked off first, and so on.
Can someone suggest a clean method by which we can accomplish this?
Facing the same problem, we at our company wrote a custom Groovy script that explores the full dependency tree and generates the Ant build scripts based on all the .project, .classpath, and .settings/* files. This wasn't as difficult as it might seem at first. This way we can build our products without (My)Eclipse on a clean CVS+JDK+Groovy virtual machine. Hope it helps.