Is there an Ant equivalent to make's generic rules?

Is there an Apache Ant equivalent to make rules of the form:
%.object: %.source
	${TRANSFORMER} $< -o $@
This is a pattern that I find myself using frequently and really miss from make. It would be even better if there is an equivalent that can detect changes to the source files similar to the way make does.
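For what it's worth, the closest Ant built-in is probably the <apply> task combined with a mapper, which also gives you the out-of-date check — a minimal sketch, where the ${transformer} property, directories, and extensions are all placeholders:

```xml
<!-- Sketch only: "transformer", "src", "build", and the extensions are
     placeholders. With dest= plus a mapper, <apply> runs the command only
     for sources whose mapped target is missing or older, like make. -->
<target name="transform">
  <apply executable="${transformer}" dest="build">
    <srcfile/>
    <arg value="-o"/>
    <targetfile/>
    <fileset dir="src" includes="*.source"/>
    <mapper type="glob" from="*.source" to="*.object"/>
  </apply>
</target>
```

The nested <srcfile/>, <arg>, and <targetfile/> elements define the command line in order, so this runs roughly `${transformer} src/foo.source -o build/foo.object` per changed file.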

Related

build research and use it as an external package for research project

I want to do some research on quantization/sparsification, and I would like to use the run_experiment.py script as a template. To do this in a clean manner, since research is not part of the pip package, I was wondering if it is possible to build it myself and then reuse it as a dependency (run_experiment.py uses some functions from research). I am not sure, however, how to do it. I am not familiar with bazel; I was able to install it and run the script, and that's all. Any guidance would be highly appreciated! Or if it's not possible, that would be good to know as well. Thank you for any advice in this matter.
EDIT:
I built something using bazel and I have it in bazel-bin. I don't know, however, how to reuse it in my script as if I were simply importing it the Python way:
from research.compression import compression_process_adapter
or something similar in my script.
Using TFF for Federated Learning Research gives a rough introduction and suggestions for organizing experiments conceptually.
From there, seeing how "run scripts" are set up in the various sub-directories under tensorflow_federated/python/research/ might provide good examples. If there is a sub-directory that is close to what you want to accomplish, forking/copying it might be a good place to start.
For instance, tensorflow_federated/python/research/gans/experiments/emnist/run_experiments.py might be a useful example of how to set up an experiment grid. It iteratively runs tensorflow_federated/python/research/gans/experiments/emnist/train.py, which shows how to import libraries under the research/ directory. Note that all of these use bazel, and the dependencies for the imports are declared in the tensorflow_federated/python/research/gans/experiments/emnist/BUILD file.
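In outline, such a BUILD file pairs the run script with the research/ libraries it imports — a hypothetical sketch, where the target names and the dependency label are illustrative, not the repo's actual ones:

```python
# BUILD (Starlark) -- hypothetical labels for illustration only.
py_binary(
    name = "run_experiments",
    srcs = ["run_experiments.py"],
    deps = [
        ":train",  # the library the run script drives
    ],
)

py_library(
    name = "train",
    srcs = ["train.py"],
    deps = [
        # An import of another package under research/ is declared as a
        # label like this (placeholder name):
        "//tensorflow_federated/python/research/compression:some_lib",
    ],
)
```

Once the import is declared here, `bazel run` sets up the Python path so the `from research...` import works.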
Finally, this script can be run with (from the git repo root directory):
bazel run -c opt tensorflow_federated/python/research/gans/experiments/emnist:run_experiments

Using large non-bazel dependencies in a bazel project

I would like to use a very large non-bazel system in a bazel project. Specifically, ROS2. This dependency provides a large number of Python, C, and C++ libraries which are built using its own hand-rolled build system. Obviously, I would like to avoid having to translate the entire build system over to bazel.
Broadly, what's the best way of doing this? My instinct was to use a custom repository rule to download the source (since it's split across many repositories), then use a genrule to call the ROS2 build system, then write simple cc_import and py_library rules for each of the individual components that I need.
However, I'm having trouble with the step where I need to call the foreign build system. It seems that genrules require a list of output files to be specified, whereas I would like to make an entire build directory available.
Before I spend any more time on this, I thought I'd ask whether I'm on the right lines, since I'm new to bazel. Is this a good strategy? How would you approach this problem? Are there any other projects that mainly use bazel but call other build systems in this way that I could look at?
As of recently, you can use rules_foreign_cc to build native CMake or make/configure-style projects.
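In outline, that looks something like the following (the `cmake` rule is loaded from rules_foreign_cc's defs.bzl; the target name, external repository, and library names below are placeholders):

```python
# BUILD file: wrap a CMake-based external source tree with rules_foreign_cc.
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")

cmake(
    name = "ros2_rcl",                   # placeholder target name
    lib_source = "@ros2_sources//:srcs", # placeholder external repo filegroup
    out_static_libs = ["librcl.a"],      # declare the artifacts you need
)
```

This sidesteps the genrule problem: the rule runs the foreign build in its own tree and exposes only the declared outputs to the rest of the bazel graph.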

How to re-use existing CMake variables with new generator

I need to build OpenCV for both 32-bit and 64-bit in VS2015.
I'm aware that I need a separate build tree for each generator.
OpenCV's CMake configuration has approximately 300 user-configurable variables, which I have finally got set to my satisfaction. Now I want to use the exact same set of decisions to build the 64-bit version.
Is there a way to transfer the variable values that represent my decisions to the new build tree? (Other than opening two CMake-GUIs side by side and checking that all ~300 values correspond.)
BTW, if the generator is changed, CMakeCache.txt must be deleted, according to the CMake mailing list [ http://cmake.3232098.n2.nabble.com/Changing-the-the-current-generator-in-CMake-GUI-td7587876.html ]. Manually editing it is very risky and will likely lead to undefined behaviour.
Thanks
Turning my comment into an answer
You can use a partial CMakeCache.txt in the new directory (CMake will just pre-load the values that are there and reevaluate the rest).
So you can use a grep-like approach and do:
findstr "OpenCV_" CMakeCache.txt > \My\New\Path\CMakeCache.txt
Just tested it and seems to work as expected.
Reference
What are good grep tools for Windows?
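The same idea with grep, on platforms where findstr isn't available — a sketch with an inline stand-in cache so it is self-contained; the real build paths and cache contents will of course differ:

```shell
# Stand-in 32-bit cache with mixed entries (the real file is much larger):
mkdir -p build32 build64
printf 'OpenCV_DIR:PATH=/opt/cv\nCMAKE_GENERATOR:INTERNAL=Visual Studio 14 2015\n' \
  > build32/CMakeCache.txt

# Seed the new tree with only the OpenCV_ decisions; on the next configure,
# CMake pre-loads these values and re-evaluates everything else.
grep "OpenCV_" build32/CMakeCache.txt > build64/CMakeCache.txt
cat build64/CMakeCache.txt   # -> OpenCV_DIR:PATH=/opt/cv
```

Note that generator-internal entries are deliberately filtered out, which is what makes the partial cache safe to carry over to a different generator.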

ant build.xml in SCons

I am making use of a library project which uses ant to build. My project, however, uses SCons because I need a far more complex build setup. Now I would like to drive ant from SCons, but without imposing the problematic CLASSPATH setup and installation that ant requires.
So I am currently thinking of writing a build.xml parser which turns the ant targets into SCons tasks.
Does anyone know whether this has been done before?
As far as I can tell there is no such parser in existence, which I believe is partly because of the great difference in how SCons and ant work, especially when it comes to dependency resolution. It should be possible, but the translated output will look very little like idiomatic SCons, be quite unreadable, and probably be quite difficult to maintain. Which pretty much defeats the whole reason to use SCons in the first place.
Since the library already uses ant, it would probably be a good idea to just incorporate running ant into SCons. If SCons can invoke ant, then you won't have to maintain the library's build script (unless you are also the one maintaining the ant script).
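A minimal SConstruct sketch of that approach (the jar name and paths are illustrative, and it assumes ant is on the PATH):

```python
# SConstruct: delegate the library's build to its own ant script.
env = Environment()

# Re-run ant whenever the library's build.xml changes; jar name and
# paths here are placeholders.
antlib = env.Command(
    target='lib/dist/library.jar',
    source='lib/build.xml',
    action='ant -f $SOURCE jar',
)

# Anything that consumes the jar just depends on the Command's target.
env.Depends('app', antlib)
```

Since SCons can't see ant's own source dependencies, you may want env.AlwaysBuild(antlib) and let ant's incremental build decide whether anything actually needs recompiling.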
Have you seen this: http://geosoft.no/development/android.html? We're also looking at converting an ant-based Android build into our over-arching SCons build, and this looks like a good starting point.

techniques for parsing interactive input (like a shell)?

I'm working on a program that can be used directly from the command line with options and input files, or entirely interactively like a shell. For the initial execution I'm using GNU's Getopt to parse the command line options.
When operating on a file I'm using Flex and Bison. This simplifies the parsing greatly since the grammar is very simple, but I'm not entirely sure how I should tackle the shell aspect. I have used GNU's readline and history libraries before, but when I did I relied solely on strtok and many comparisons with nested switch statements. It worked, but it seemed kind of like a hack job to me...
Is there a better way to approach this problem?
For the data input that the shell would accept, I was thinking about piping it directly to a temp file and using Flex and Bison again, but for the various parameters (like the command-line options that Getopt is parsing for me now), is there a better way?
I was toying with the idea of recycling my getopt code, since it's flexible enough to capture everything, and if something isn't an option I could assume it's data and pipe it out. But I'd love a second opinion.
Thanks
Just write it in Python. Use the cmd module to write the shell program and use shlex for parsing input just like the shell.
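A minimal sketch of that approach — the `load`/`quit` commands here are invented for illustration:

```python
import cmd
import shlex

class MiniShell(cmd.Cmd):
    """Tiny interactive shell: cmd provides the REPL loop, history hooks,
    and command dispatch; shlex splits arguments with shell-style quoting."""
    prompt = '> '

    def do_load(self, line):
        # shlex.split honors quotes, e.g.: load "my file.txt" -v
        args = shlex.split(line)
        print(f"loading {args}")

    def do_quit(self, line):
        return True  # returning True exits cmdloop()

# To run interactively: MiniShell().cmdloop()
```

cmd dispatches a line like `load foo` to `do_load('foo')` automatically, so there is no hand-rolled strtok/switch dispatch, and built-in `help` comes for free.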
