I am currently investigating Bazel as a tool to speed up Java builds. I have a somewhat complex build to handle, including shading of many libs.
This shading is performed today using the maven-shade-plugin. I could not find a Bazel equivalent.
The solution should be able to:
aggregate multiple input jars
filter files in/out
specify which artifacts to include
parameterize relocations (!)
propose a mechanism equivalent to resource transformers https://maven.apache.org/plugins/maven-shade-plugin/examples/resource-transformers.html
If this is out of reach, I would be very interested in some generic way to specify some input, some output, and "something" to launch to produce the latter from the former.
Any java_binary has an implicit _deploy.jar output that contains all classes and is similar to the shaded jar:
name_deploy.jar: A Java archive suitable for deployment (only built if explicitly requested)
The deploy jar contains all the classes that would be found by a classloader that searched the classpath from the binary's wrapper script from beginning to end.
https://docs.bazel.build/versions/master/be/java.html#java_binary_implicit_outputs
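For example, given a hypothetical target like this (all names made up):

```
java_binary(
    name = "app",
    srcs = glob(["src/main/java/**/*.java"]),
    main_class = "com.example.Main",    # hypothetical entry point
    deps = ["//third_party:some_lib"],  # hypothetical dependency
)
```

running bazel build //:app_deploy.jar produces the single self-contained jar.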
But I don't think bazel provides any of the other features that you are requesting.
Related
I would like to use a very large non-Bazel system in a Bazel project. Specifically, ROS2. This dependency provides a large number of Python, C, and C++ libraries which are built using its own hand-rolled build system. Obviously, I would like to avoid having to translate the entire build system over to Bazel.
Broadly, what's the best way of doing this? My instinct was to use a custom repository rule to download the source (since it's split across many repositories), then use a genrule to call the ROS2 build system. Then write my simple cc_import and py_library rules for each of the individual components that I need.
However, I'm having trouble with the bit where I need to call the foreign build system. It seems that genrules require a list of output files to be specified, while I would like it to make an entire build directory available.
Before I spend any more time on this, I thought I'd ask whether I'm on the right lines, since I'm new to Bazel. Is this a good strategy? How would you approach this problem? Are there any other projects that mainly use Bazel but call other build systems in this way that I can look at?
As of recently, you can use rules_foreign_cc to build foreign CMake or configure/make style projects.
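For example, a rough sketch of wrapping one CMake-based library (the load path and attribute names follow recent rules_foreign_cc releases; the target, repository, and output names are made up):

```
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")

cmake(
    name = "some_ros2_lib",           # hypothetical
    lib_source = "@ros2//:all_srcs",  # filegroup over the fetched sources
    out_static_libs = ["librcl.a"],   # whatever the foreign build emits
)
```

The rule runs CMake in its own build directory and exposes the declared outputs through CcInfo, so ordinary cc_* targets can depend on it directly.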
I want to build a rule that is very similar to cc_proto_library. The key features are that it would apply an aspect to all the transitive proto_library dependencies and generate .cc and .h files for all of the dependencies. In addition it would generate actions that would compile these into object files.
While I understand how I can do the file generation, I don't see how to easily do the object generation. The native module is not available for rule (or aspect) implementations, and I cannot use a macro on top of the aspect, as I need the object files to be generated in the same package as the proto_library so that they are generated only once.
cc_proto_library can do this, I believe, because it is not written in Skylark and thus has access to more primitives. Is there any way to do this with just Skylark?
This is unfortunately currently not possible. There is no Skylark API to the C++ rules/actions (what we call C++ sandwich). We have plans to implement this in Q1 2018. There are many tracking issues, this one looks the most relevant: https://github.com/bazelbuild/bazel/issues/2163.
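For what it's worth, the codegen half is expressible today; it is only the compile-to-object step that has no API. A rough sketch of that half, written against a current Bazel API (the protoc label is a placeholder and proto path handling is simplified):

```
def _cc_proto_gen_aspect_impl(target, ctx):
    outs = []
    for src in target[ProtoInfo].direct_sources:
        base = src.basename[:-len(".proto")]
        cc_out = ctx.actions.declare_file(base + ".pb.cc", sibling = src)
        h_out = ctx.actions.declare_file(base + ".pb.h", sibling = src)
        ctx.actions.run(
            executable = ctx.executable._protoc,
            arguments = ["--cpp_out=" + cc_out.root.path, src.path],
            inputs = [src],
            outputs = [cc_out, h_out],
            mnemonic = "ProtoCcGen",
        )
        outs += [cc_out, h_out]
    # Compiling outs into object files is exactly the step with no Skylark API.
    return [OutputGroupInfo(gen_cc = depset(outs))]

cc_proto_gen_aspect = aspect(
    implementation = _cc_proto_gen_aspect_impl,
    attr_aspects = ["deps"],
    attrs = {
        "_protoc": attr.label(
            default = Label("@com_google_protobuf//:protoc"),
            executable = True,
            cfg = "exec",
        ),
    },
)
```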
As part of our efforts to create a Bazel-Maven transition interop tool (one that creates Maven-sized jars from more granular Bazel jars),
we want the aspect that runs during bazel build to access the target's java_common.provider in order to fetch the jars and ijars from it.
Is that possible?
The short answer is yes, that's possible.
You can use the java_common module in an aspect implementation the same way you would use it in a rule implementation.
From the documentation on java_common.provider:
java_common.provider.compile_jars and java_common.provider.transitive_compile_time_jars refer to the ijars used at compilation time
java_common.provider.transitive_runtime_jars refer to the full jars used at runtime.
The full jars at compilation time are not yet available, but someone is working on exposing this feature. (Issue #3528 on GitHub.)
Make sure you also read the blog post on this topic: https://blog.bazel.build/2017/03/07/java-sandwich.html
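Putting it together, a minimal aspect sketch (the aspect name and attribute list are made up; the prints just demonstrate access to the provider):

```
def _jar_collector_impl(target, ctx):
    if java_common.provider in target:
        jp = target[java_common.provider]
        ijars = jp.compile_jars               # ijars used at compile time
        runtime = jp.transitive_runtime_jars  # full jars used at runtime
        print("%s ijars: %s" % (target.label, [j.basename for j in ijars.to_list()]))
        print("%s runtime: %s" % (target.label, [j.basename for j in runtime.to_list()]))
    return []

jar_collector = aspect(
    implementation = _jar_collector_impl,
    attr_aspects = ["deps", "runtime_deps", "exports"],
)
```

You can then attach it from the command line with something like bazel build //your:target --aspects your/file.bzl%jar_collector.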
Problem
I've been developing a game in C++ in my spare time and I've opted to use Bazel as my build tool, since I have never had a ton of luck (or fun) working with make or cmake. I also have dependencies in other languages (Python for some of the high-level scripting). I'm using glfw for basic window handling and high-level graphics support, and that works well enough, but now comes the problem: I'm uncertain how I should handle dependencies like glfw in a Bazel world.
For some of my dependencies (like gtest and fruit) I can just reference them in my WORKSPACE file and Bazel handles them automagically but glfw hasn't adopted Bazel. So all of this leads me to ask, what should I do about dependencies that don't use Bazel inside a Bazel project?
Current approach
For many of the simpler dependencies I have, I simply created a new_git_repository entry in my WORKSPACE file and created a BUILD file for the library. This works great until you get to really complicated libraries like glfw that have a number of dependencies on their own.
When building glfw for a Linux machine running X11, you now have a dependency on X11, which would mean adding X11 to my Bazel setup. X11 comes with its own set of dependencies (the X11 libraries like X11Cursor), and so on.
glfw also tries to provide basic joystick support, which is provided by default on Linux. Great! Except that this is provided by the kernel, which means that the kernel is also a dependency of my project. While I shouldn't need anything more than the kernel headers, this still seems like a lot to bring in.
Alternative Options
The reason I took the approach I've taken so far is to make the dependencies required to spin up a machine that can successfully build my game very minimal. In theory they just need a C/C++ compiler, Java 8, and Bazel and they're off to the races. This is great since it also means I can create a Docker container that has Bazel installed and do CI/CD really easily.
I could sacrifice this ease and just say that you need to have libraries like glfw installed before attempting to compile the game, but that brings back the whole "which version is installed and how is it configured" problem that Bazel is supposed to help solve.
Surely there is a simpler solution and I'm overthinking this?
If the glfw project has no BUILD files, then you have the following options:
Build glfw inside a genrule.
If glfw supports some other build system like make, you could create a genrule that runs the tool. This approach has obvious drawbacks, like the not-to-be-underestimated impracticality of having to declare all inputs of that genrule, but it'd be the simplest way of Bazel'izing glfw.
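A rough sketch of such a genrule, as it might appear in the BUILD file written for the glfw repository (the CMake invocation, make target, and output path are guesses, and cmake/make still have to exist on the host):

```
genrule(
    name = "glfw_cmake_build",
    srcs = glob(["**"]),  # every file the foreign build reads must be declared
    outs = ["libglfw3.a"],
    cmd = """
        src_dir=$$(dirname $(location CMakeLists.txt))
        build_dir=$$(mktemp -d)
        cmake -S $$src_dir -B $$build_dir > /dev/null
        make -C $$build_dir glfw > /dev/null
        cp $$build_dir/src/libglfw3.a $@
    """,
)
```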
Pre-build glfw.o and check it into your source tree.
You can create a cc_library rule for it, and put the .o file in the srcs. Even though this solution is the least flexible of all because you not only restrict the target platform to whatever the .o was built for, but also make it harder to reproduce the whole build, the benefits are sometimes worth the costs.
I view this approach as a last resort. Even in Bazel's own source code there's one cc_library.srcs that includes a raw object file, because it was worth it, as the commit message of 92caf38 explains.
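As a sketch (all paths hypothetical):

```
cc_library(
    name = "glfw",
    srcs = ["prebuilt/glfw.o"],         # object file checked into the tree
    hdrs = glob(["include/GLFW/*.h"]),
    includes = ["include"],
    linkopts = ["-lX11"],               # system libs are still resolved at link time
)
```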
Require that glfw be installed.
You already considered this option. Some people may prefer this to the other approaches.
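If you go that route, a thin wrapper target keeps the dependency explicit in BUILD files (this assumes the library and headers are installed where the default C++ toolchain can find them):

```
cc_library(
    name = "glfw_system",
    linkopts = ["-lglfw"],  # link against the preinstalled library
)
```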
I have a few APIs of my own with around 2000 classes overall. Some of them use the new Path API from JDK7. Most other classes, however, do not rely on any new JDK APIs or new language features, so most classes could be used in a JDK6 environment (which I plan to do). Let's assume I've annotated all JDK7-only classes with @Java7Only.
What I need now, is a way to create a JDK6-only subset of all my projects more-or-less automatically, without introducing new version branching or product lines (would be too complicated to maintain).
All projects are created using NetBeans, thus using Ant. Many projects depend on others.
Please help me evaluate which idea is most appropriate for my problem. Which problems could occur with each idea?
Common first step for all ideas
Let an annotation processor search for @Java7Only-annotated classes and store the list in a properties file.
Idea 1 (specific)
Write a tool which would use the properties file to recursively copy the whole project, except JDK7-only files.
Build the copied project using JDK6 by invoking ant, thus getting a JDK6-compliant jar.
Idea 2 (specific)
Write a second annotation processor which would use the properties file to pass everything except JDK7-only files to a JavaCompiler instance.
Either build a jar using Java APIs or use Ant API for that.
(This would be a Java-only idea, but probably too complicated)
Idea X (abstract)
Somehow influence the Ant build process (by overriding some targets?) and, for each JDK6-compliant class, let Ant compile two versions of it (one with the JDK6 compiler, another with the JDK7 compiler).
(JDK7-only classes would be compiled only once, using the JDK7 compiler, of course)
Package each bunch to a separate jar.
Possible problems common to the ideas
Some projects depend on others, so some actions (such as packaging) have to take this into account.
Remember: the JDK7 compiler generates class files that JDK6 cannot load, which is why every possible idea has to happen at the source level (before or during the build process, not afterwards).
My thoughts on Idea 2:
Essentially this is invoking a compiler within a compiler. Annotation processors are run as part of compilation. Can this be done safely? Is there any static state in Sun's javac that would cause problems? (I don't know the answer, but from memory there might be some static state that could cause problems in this scenario.)
Idea 1 seems simpler and better to me.
But taking a step back, is it possible to separate out all the JDK 7 specific stuff into a separate module and compile it separately, into a different JAR?
Have the 'main' project compiled using JDK 6 (which JDK 7 would have no problem reading, because it is backwards compatible).
The JDK 7 specific module(s), with source in a different directory and the 'main' JAR on the compilation classpath, could be built separately, with a different build.xml if necessary.
This only partially applies, but I thought I'd mention it anyway.
The problem with just using the -source 1.6 -target 1.6 options for validation is that you can still use the Java 7 API when compiling with JDK 7.
I've used the Animal Sniffer Maven Plugin for a few projects now and it has proved quite useful. This plugin scans the byte-code of your classes for JDK API usage. That is, you can tell it to fail the build if you attempt to use the JDK 7 API when you are targeting JDK 6. This won't help much with separating out classes as you need, but it could be useful as a final validation step combined with the -source 1.6 -target 1.6 compiler options.
There is also an animal sniffer Ant plugin, as mentioned from the Animal Sniffer main page.