We have a file set in our Ant build which looks like this:
<fileset dir="somewhere/lib" includesfile="third-party-jars.txt"/>
Every jar is then spelled out in full in the text file. (The reason we have it in a separate text file actually has nothing to do with build file brevity, but rather so that it's easier to inspect the list from an integration test we have which checks that all third-party jar file licences are documented.)
Someone made a commit which prevented a file from going into the build. Ant didn't pick up the missing file at all - the build proceeded. It seems bad that a missing file isn't detected even though the file names are spelled out in full... so I'm trying to solve that now.
There is an erroronmissingdir attribute, but adding it doesn't help (the dir isn't missing). Is there a way to get an error on missing files in a fileset? Ideally a compact way, because we have more than one of these filesets and duplicating code doesn't sound appealing.
Other people have asked simpler variants of this where they wanted to check a single file, which the available task handles. It seems that available only supports one resource, though.
I took a shot at restricting the resource collection to try and find available vs. unavailable resources.
<restrict id="temp.available">
    <resources refid="@{ref-name}"/>
    <!-- keep only the resources which exist on disk -->
    <exists/>
</restrict>
<difference id="temp.unavailable">
    <resources refid="@{ref-name}"/>
    <resources refid="temp.available"/>
</difference>
<fail message="Missing stuff">
    <condition>
        <resourcecount refid="temp.unavailable" when="greater" count="0"/>
    </condition>
</fail>
This doesn't work, seemingly because the resource collection is already missing the items which don't exist.
What you are asking for isn't possible with native fileset support. A fileset works as a scanner: it starts from a particular directory and includes whatever files match the inclusion/exclusion patterns you provide. Even erroronmissingdir wasn't part of fileset until Ant 1.7 or so.
Your best bet is to write a macro around the available task, passing a resource collection to the macro and using a for loop (from Ant-Contrib) to test the availability of each file.
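For example, here is a minimal sketch, assuming Ant-Contrib has already been declared via its usual taskdef and that the includes file lists one path per line relative to the fileset's dir; the macro name assert-files-exist is made up. It iterates the lines of the includes file rather than the resulting fileset, since the fileset itself silently drops missing entries:

<macrodef name="assert-files-exist">
    <attribute name="dir"/>
    <attribute name="includesfile"/>
    <sequential>
        <!-- Read the list of expected files; the property name includes the
             file name so repeated calls don't collide. -->
        <loadfile property="assert.files.@{includesfile}" srcFile="@{includesfile}"/>
        <for param="entry" list="${assert.files.@{includesfile}}"
             delimiter="${line.separator}" trim="true">
            <sequential>
                <fail message="Missing file: @{dir}/@{entry}">
                    <condition>
                        <not>
                            <available file="@{dir}/@{entry}"/>
                        </not>
                    </condition>
                </fail>
            </sequential>
        </for>
    </sequential>
</macrodef>

<!-- Usage, alongside the existing fileset -->
<assert-files-exist dir="somewhere/lib" includesfile="third-party-jars.txt"/>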
I would like a set of rules from my_package.bzl to be accessible to all BUILD files of a workspace without having to load my_package.bzl in the BUILD files. Basically I want the rules in the package to look like native rules. How can I achieve this?
I was thinking maybe there's a line I could add to one of the .bazelrc files or to the WORKSPACE file of the project.
This can be achieved by adding a prelude_bazel file at //tools/build_rules:prelude_bazel (this must be a package, so tools/build_rules must contain a BUILD file).
This will be loaded and prepended to all BUILD files loaded by Bazel.
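For example, a minimal sketch, assuming my_package.bzl also lives in //tools/build_rules and exports a rule named my_rule (the rule name is illustrative):

# tools/build_rules/prelude_bazel
# Each load() here is implicitly prepended to every BUILD file in the workspace.
load("//tools/build_rules:my_package.bzl", "my_rule")

Any BUILD file can then call my_rule(...) directly, without its own load() line, as if it were a native rule.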
However, there are a few things to consider before going this route. It's currently undocumented, and from the little information that turns up when searching for this feature, it's unclear whether it will remain a part of Bazel.
It may also have performance / scaling problems. If the prelude were to change (or any of its dependencies), every BUILD file would have to be reloaded, and this may take some time depending on the size of the build graph.
While migrating an existing build to Bazel, I have a submodule mod1 that has some JUnit tests reading files from a "testdata" directory. When trying to load those files, I have to use "mod1/testdata/test.txt" instead of "testdata/test.txt", i.e. the unit tests have to be aware of their corresponding Bazel module directory.
(1) Is this the correct behaviour for Bazel 0.23.2#debian and 0.23.2-homebrew?
(2) Is there a way to use the .java tests without changes, and to remove the need for the "mod1" prefix in Bazel data / runfiles?
My sample project is here: https://gitlab.com/jhinrichsen/bazel-data-test. I am looking for a way to use the same path "testdata/test.txt" for both the root module and the submodule. In my example project, bazel test AllTests succeeds, while bazel test mod1/AllTests fails because I need to prepend "mod1/" to "testdata/test.txt".
Not looking for a resources/classpath-based solution, as I cannot modify the existing test sources.
The behavior that you are seeing is indeed the correct behavior, and there is no way to strip the "mod1" prefix with the native Java rules. Anything you include with data will be scoped to its own package in the way you're seeing.
The reason for this is pretty straightforward. Let's say that your test target, //mod1:AllTests, also depended on a hypothetical //mod2:tests library. And let's say that hypothetical library also had a testdata/test.txt as a data dependency. The multiple test.txt files would conflict unless they were namespaced to their packages.
If you absolutely cannot modify the test source at all, then you are pretty much stuck. Here's a previous discussion about this:
https://groups.google.com/forum/#!topic/bazel-discuss/w6TDwSZvN0k
If you're trying to work with Bazel, I would recommend accepting the concept of runfiles and modifying your tests either to work with the runfiles structure or to accept a command-line argument telling them where to find the test data.
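As a sketch of the second option (assuming the tests can in fact be touched; the property name testdata.dir and the helper class are made up):

import java.io.File;

// Test helper: resolve the data directory from a system property with a
// sensible default, instead of hard-coding "testdata" or "mod1/testdata".
final class TestData {
    static File file(String name) {
        File dir = new File(System.getProperty("testdata.dir", "testdata"));
        return new File(dir, name);
    }
}

The java_test target can then pass the package-specific prefix, e.g. jvm_flags = ["-Dtestdata.dir=mod1/testdata"], so the same test source works from either package.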
Is there a way to specify optional dependencies in Bazel?
I'd like to make a rule to somewhat mirror Kitware's ExternalData, but I would like to see if I can enable workflows where the developer edits the file in-tree, ideally without needing to modify the BUILD file.
Ideal Workflow
Define a rule, external_data, which can fetch a file from a given server given its SHA-512.
If the file already exists, check its SHA-512.
If the hash is the one requested, symlink / copy this file (ensuring that no tests can modify the original file).
If it is different, print a warning but proceed as normal, to allow developers to quickly modify the large files as they need.
I would like to do this in such a way that Bazel can switch between the file being present or absent, and be robust against false positives in caching. Here is an example scenario I would like to avoid if the file is not included as an optional dependency:
In a prior run, the file was in the workspace, Bazel built the target, everything's fine and dandy.
Developer removes the file from the workspace after uploading, satisfied with their changes and wanting to test the download process.
When running the downstream target, Bazel doesn't notice the change in the workspace since the file isn't an explicit dependency; the symlink is now dangling, and the test crashes and burns.
To me, it seems like I'd run into this if I tried to implement a repository_rule which manually checks for the file's existence and conditionally executes (I'm not sure whether analysis would re-trigger the rule being "evaluated" if step 2 happens).
Workaround
My current thought for an alternative workflow is to have an explicit option for external_data, use_workspace: if False, it will download the file; if True, it will just mirror exports_files([]). The developer can then set this when modifying files.
(Ideally, I'd like to optionally include a file which indicates the SHA (${file}.sha512), but this seems to go back to the original ask.)
One workaround is to use Bazel's glob(...) method to effectively check for file existence.
If you have a file, say basic.bin.sha512, and you want a rule to switch modes based on that file's existence, you can use glob(["basic.bin.sha512"]), which will either match the package file exactly or return an empty list.
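For instance, here is a minimal load-time macro sketch in a .bzl file, where the glob result drives the mode switch; the macro name external_data and the fetch rule _fetch_by_sha512 are made up:

# external_data.bzl (sketch)
def external_data(name):
    # native.glob() runs at load time and returns the matching package files,
    # or [] when nothing is checked in, so it doubles as an existence check.
    if native.glob([name + ".sha512"]):
        # A <file>.sha512 is checked in: fetch the real file by content hash
        # via a hypothetical download rule.
        _fetch_by_sha512(
            name = name,
            sha512_file = name + ".sha512",
        )
    else:
        # No SHA file: assume the developer is editing the file in-tree.
        native.exports_files([name])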
I had tinkered around with using this on larger sets of files, and it appears to work. However, for the time being, I've erred towards having a sort-of explicit "development" mode for the target definition, to keep the Bazel build relatively consistent regardless of which files may be checked out.
Here's an example usage:
https://github.com/EricCousineau-TRI/external_data_bazel/blob/4bf1dff/WORKFLOWS.md#edit-files-in-a-sha512-group
I am preparing a release for an application using rebar, and I wonder what the usual way is to include a header file from the standard library. In my case, it is the wx.hrl file, which my code currently includes via its full absolute path.
I guess that it is not the right way :o)
-include_lib("wx/include/wx.hrl").
This makes the preprocessor look for the latest version of the wx application in the code path. See this question for more details.
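In other words, instead of hard-coding a machine-specific path (the install root and version below are illustrative), let the preprocessor resolve it against whatever wx version is installed:

%% Fragile: breaks whenever the wx version or the Erlang install root changes.
-include("/usr/lib/erlang/lib/wx-1.8.3/include/wx.hrl").

%% Portable: resolved against the wx application found in the code path.
-include_lib("wx/include/wx.hrl").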
J2ME lacks the java.util.Properties class. Although it is possible to put application settings in the JAD file, this is not recommended for many properties, since some platforms limit the size of the JAD file. I want to put a configuration file inside my jar file and parse it, and I do not want to go with XML because it would be overkill for my case.
The question is: is there an existing library for J2ME that can parse properties files, or something similar such as INI files? Or would you recommend another method to solve the initial problem?
The best solution probably depends on what is going to be generating the properties files.
If you've got other non-JavaME projects using the same properties files, then stick with them, and write or find a parser. (There is a simple one from GoBible available on Google Code)
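If you do write a parser, a minimal CLDC-only sketch might look like this (no escape sequences or continuation lines; the class name and the key=value file format are assumptions on my part):

import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.Reader;
import java.util.Hashtable;

// Parses simple "key=value" lines from a resource bundled in the jar,
// skipping blank lines and lines starting with '#'.
public final class SimpleProperties {
    public static Hashtable load(String resource) throws IOException {
        Hashtable props = new Hashtable();
        InputStream in = SimpleProperties.class.getResourceAsStream(resource);
        if (in == null) {
            throw new IOException("Resource not found: " + resource);
        }
        Reader reader = new InputStreamReader(in);
        StringBuffer line = new StringBuffer();
        int c;
        while ((c = reader.read()) != -1) {
            if (c == '\n' || c == '\r') {
                addLine(line.toString(), props);
                line.setLength(0);
            } else {
                line.append((char) c);
            }
        }
        addLine(line.toString(), props); // last line may lack a newline
        in.close();
        return props;
    }

    private static void addLine(String line, Hashtable props) {
        String trimmed = line.trim();
        if (trimmed.length() == 0 || trimmed.startsWith("#")) {
            return;
        }
        int eq = trimmed.indexOf('=');
        if (eq > 0) {
            props.put(trimmed.substring(0, eq).trim(),
                      trimmed.substring(eq + 1).trim());
        }
    }
}

Usage would be something like Hashtable config = SimpleProperties.load("/config.properties"); where the resource name matches whatever file you bundle in the jar.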
However, you might find it just as easy to keep your configuration as static final String myproperty = "myvalue"; constants in a Configuration.java file which you compile and include in the jar instead, since you then do not need any special code to locate, open, read, and parse a file.
You do then pick up a limitation on what you can call them, though, since you can no longer use the common dot-separated namespacing idiom.
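For example (a sketch; the names and values are illustrative, with underscores standing in for the dots):

// Configuration.java -- settings compiled straight into the jar.
public final class Configuration {
    private Configuration() {}

    // What would have been server.url and retry.count in a properties file.
    public static final String SERVER_URL = "http://example.com/api";
    public static final int RETRY_COUNT = 3;
}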