I want to pick up certificates from different paths and add them to the final zip file. I have an XML file which has the locations of all the certs. I need Maven to read from this XML file and pick up the certs. I am stuck here.
As reading files is generally not an intended Maven use case, getting the certs into your final artifact will be cumbersome.
I see mainly 2 options for you.
First, use the gmaven-plugin and execute a Groovy script to copy your files to an appropriate place (somewhere in the target folder, I guess).
This answer on SO provides hints on how to do that.
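A rough sketch of what that could look like, assuming a certs.xml of the form <certs><cert path="..."/></certs> next to the POM; the plugin version, the file layout, and the variable bindings are assumptions to adapt:

<plugin>
  <!-- sketch only: groovy-maven-plugin coordinates/version may need adjusting -->
  <groupId>org.codehaus.gmaven</groupId>
  <artifactId>groovy-maven-plugin</artifactId>
  <version>2.1.1</version>
  <executions>
    <execution>
      <phase>prepare-package</phase>
      <goals>
        <goal>execute</goal>
      </goals>
      <configuration>
        <source>
          // read cert locations from certs.xml and copy them into target/certs
          def certs = new XmlSlurper().parse(new File(project.basedir, 'certs.xml'))
          def outDir = new File(project.build.directory, 'certs')
          outDir.mkdirs()
          certs.cert.each { c ->
            def src = new File(c.@path.toString())
            new File(outDir, src.name).bytes = src.bytes
          }
        </source>
      </configuration>
    </execution>
  </executions>
</plugin>

The target/certs directory can then be picked up by whatever assembly descriptor builds your final zip.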
An alternative would be to pack this process into your own Maven plugin, which is less difficult than it sounds. The advantage would be that the copy logic is sealed in the plugin (in contrast to an openly accessible Groovy script in the POM) and is therefore more maintainable and secure.
After downloading an archive through http_archive I'd like to run a script to generate a BUILD file from the folder structure and CMake files in it (I currently do that by hand, and it is easy enough that it could be scripted). I can't find anything on how to open, read and write files in the Starlark documentation, but since http_archive itself is loaded from a .bzl file (I haven't found the source of that file yet, though) and generates BUILD files (by unpacking them from archives), I guess it must be possible to write a wrapper for http_archive that also generates the BUILD file?
This is a perfect use case for a custom repository rule. That lets you run arbitrary commands to generate the files for the repository, along with some helpers for common operations like downloading a file over HTTP using the repository cache (if configured). A repository rule is conceptually similar to a normal rule, but with much less infrastructure, because it runs during the loading phase when most of the Bazel infrastructure doesn't apply yet.
The Starlark implementation of http_archive is in http.bzl. The core of it is a single call to ctx.download_and_extract. Your custom rule should do that too. http_archive then calls workspace_and_buildfile and patch from util.bzl, which do what they sound like. Instead of workspace_and_buildfile, you should call ctx.execute to run your command to generate the BUILD file. You could call patch if you want, or skip that functionality if you're not going to use it.
The repository_ctx page in the documentation is the top-level reference for everything your repository rule's implementation function can do, if you want to extend it further.
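As a rough sketch (the attribute names, the hypothetical generator script, and the error handling are mine, not taken from http.bzl):

# buildgen.bzl -- hedged sketch, not the real http_archive implementation
def _cmake_archive_impl(ctx):
    # download and unpack the archive into the repository directory
    ctx.download_and_extract(
        url = ctx.attr.urls,
        sha256 = ctx.attr.sha256,
        stripPrefix = ctx.attr.strip_prefix,
    )
    # run the (hypothetical) generator script to write a BUILD file
    result = ctx.execute([ctx.path(ctx.attr.generator), "."])
    if result.return_code != 0:
        fail("BUILD generation failed: " + result.stderr)

cmake_archive = repository_rule(
    implementation = _cmake_archive_impl,
    attrs = {
        "urls": attr.string_list(mandatory = True),
        "sha256": attr.string(default = ""),
        "strip_prefix": attr.string(default = ""),
        "generator": attr.label(allow_single_file = True),
    },
)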
When using http_archive, you can use the build_file argument to create a BUILD file. To generate it dynamically, I think you can use the patch_cmds argument to run external commands.
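For example, something along these lines in the WORKSPACE file; the URL is a placeholder, and how you make the generate_build.py script available inside the extracted repository is up to you:

load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "somelib",
    urls = ["https://example.com/somelib-1.0.tar.gz"],
    strip_prefix = "somelib-1.0",
    # patch_cmds run inside the freshly extracted repository,
    # so a script here can inspect the CMake files and emit a BUILD file
    patch_cmds = ["python3 generate_build.py > BUILD.bazel"],
)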
So, at work, I frequently have to create virtually identical ant scripts. Basically the application we provide to our clients is designed to be easily extensible, and we offer a service of designing and creating custom modules for it. Because of the complexity of our application, with lots of cross dependencies, I tend to develop the module within our core dev environment, compile it using IntelliJ, and then run a basic ant script that does the following tasks:
1) Clean build directory
2) Create build directory and directory hierarchy based on package paths.
3) Copy class files (and source files to a separate sources directory).
4) Jar it up.
The thing is, to do this I need to go through the script line by line and change a bunch of property names, so it works for the new use case. I also save all the scripts in case I need to go back to them.
This isn't the worst thing in the world, but I'm always looking for a better way to do things. Hence my idea:
For each specific implementation I would provide an ant script (or other file) of just properties. Key-value pairs, which would have specific prefixes for each key based on what it's used for. I would then want my ant script to run the various tasks, executing each one for the key-value pairs that are appropriate.
For example, copying the class files. I would have a property with a name like "classFile.filePath". I would want the script to call the task for every property it detects that starts with "classFile...".
Honestly, from my current research so far, I'm not confident that this is possible. But... I'm super stubborn, and always looking for new creative options. So, what options do I have? Or are there none?
It's possible to dynamically generate Ant scripts; for example, the following does this using an XML input file:
Use pure Ant to search if list of files exists and take action based on condition
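For the property-prefix idea specifically, the ant-contrib tasks can get you close. A hedged sketch (the property naming scheme, the jar location, and ${build.dir} are assumptions):

<taskdef resource="net/sf/antcontrib/antlib.xml"
         classpath="lib/ant-contrib-1.0b3.jar"/>

<target name="copy-classes">
  <!-- collect the suffix of every property named classFile.* -->
  <propertyselector property="classfile.keys"
                    match="classFile\.(.*)" select="\1" distinct="true"/>
  <!-- call copy-one once per matching property -->
  <foreach list="${classfile.keys}" param="key" target="copy-one"/>
</target>

<target name="copy-one">
  <!-- resolve the classFile property back into a concrete path and copy it -->
  <propertycopy name="current.path" from="classFile.${key}"/>
  <copy file="${current.path}" todir="${build.dir}/classes"/>
</target>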
Personally I would always try and avoid this level of complexity. Ant is not a programming language.
Looking at what you're trying to achieve, it does appear you could benefit from packaging your dependencies as jars and using a Maven repository manager like Nexus or Artifactory for storage. This would simplify each sub-project build. When building projects that depend on these published libraries, you can use a dependency management tool like Apache Ivy to download them.
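With Ivy that boils down to an ivy.xml per module, something like this (the organisation, module and revision values are made up):

<ivy-module version="2.0">
  <info organisation="com.example" module="custom-module"/>
  <dependencies>
    <!-- the core application jar published to Nexus/Artifactory -->
    <dependency org="com.example" name="core-app" rev="1.2.3"/>
  </dependencies>
</ivy-module>

plus an ivy:retrieve call in the build to pull the jars down before compiling.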
Hope that helps; your question is fairly broad.
In a Struts 2 application there are many Action classes, each with its own action-validation.xml file. How can I organize all of these validation XML files in a single folder, such as a "properties" folder, and have the validation rules picked up from there?
A lot of it depends on what build system you are using. If you are using Maven to build, for example, you can place the validation.xml files in an identically named package inside src/main/resources instead of src/main/java. In Ant, you could probably keep them anywhere and have Ant copy the XML files to the correct place in the final WAR file, though I am not sure how, not really having used Ant much.
I think the key thing is that in the final packaged WAR, the XML files need to be in the same place as the class files for that action. Where they sit before build time doesn't matter so much.
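If you do want to keep them in a dedicated folder rather than mixed into src/main/resources, Maven lets you add a second resource directory; a hedged sketch (the "validation" folder name is an assumption), provided the sub-folders still mirror the Action packages:

<build>
  <resources>
    <resource>
      <directory>src/main/resources</directory>
    </resource>
    <resource>
      <!-- e.g. validation/com/example/actions/MyAction-validation.xml
           ends up next to com/example/actions/MyAction.class -->
      <directory>validation</directory>
    </resource>
  </resources>
</build>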
I would like to simplify my main build scripts, and I'd like to have the ability to reuse certain common Ant tasks as a library, but I'd also like them to be easily available as a package.
For instance, I've got a task which sets up the Flex environment variables that I need to build a variety of projects. I can (and currently do) include those scripts by relative path from another location in source control. But what I want is to make a single downloadable package that I can grab via Ivy that contains all of these generic tasks.
A jar seems the most natural solution, since this is doable from Java (use the class loader to access the file inside the jar), but I can't seem to find a "native" way in Ant to just get the XML file.
In short, I want to do:
<import file="some.jar!bootstrap.xml">
But that doesn't work.
Is there some way to do this? Any other suggestions for making a library of Ant scripts would be much appreciated as well.
From what I understand, you're trying to extract a file containing more Ant tasks from your jar and then tell Ant to execute the tasks in those extracted files. Since the files are static, you'd probably be better off creating actual Java Task definitions in your jar and declaring them in your Ant build file. However, if you don't want to do that, you can just use the unzip task to extract the resource out of the jar onto the file system and then use the ant task to execute the extracted file.
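For instance (the jar name, internal path, and target names are placeholders):

<target name="bootstrap">
  <!-- pull bootstrap.xml out of the library jar into a temp directory -->
  <unzip src="lib/ant-common.jar" dest="${java.io.tmpdir}/ant-common">
    <patternset includes="bootstrap.xml"/>
  </unzip>
  <!-- run a target from the extracted build file -->
  <ant antfile="${java.io.tmpdir}/ant-common/bootstrap.xml" target="setup-flex"/>
</target>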
IIRC there's ongoing work in Ant to support this but it's not supported in any published version.
For those of you that use Ant with multiple projects, where do you put the build.xml files? Do you put one in each project, or do you put them in a separate project that contains all your Ant-related files?
The usual recommendation is to put a build.xml in each project. But this has a few drawbacks:
It makes it hard to reuse common targets in multiple projects.
Sometimes you want to use Ant to export a project from source control and deploy it. Obviously you can't do this if the build file is in the project itself.
But if you put them all in a common location:
People need to be aware of their location to use them; they can't just use "ant -find" to find the current project's file.
You can't have different build instructions for different branches of the project.
What do you guys do?
EDIT: Thanks for the good suggestions so far. As for Maven, these aren't Java projects, and I get the impression that Maven is only meant for Java.
Place the Ant files with the project. That is the de facto standard and recommended by the creator of Ant. I will try to address some of the issues you have brought up:
Reuse of common targets should be done using techniques as described by Erik Hatcher in his book Java Development with Ant. Basically, you extract all commonality into some top-level files that all other Ant files "inherit" from.
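In practice that usually means an import at the top of each module's build file; a hedged sketch (the paths, project names, and targets are assumptions):

<project name="module-a" default="dist">
  <!-- shared clean/compile/jar targets live in the imported file,
       whose project is assumed to be named "common" -->
  <import file="../common/common-build.xml"/>

  <!-- override a common target and still delegate to the original -->
  <target name="dist" depends="common.dist">
    <echo message="module-a specific packaging steps"/>
  </target>
</project>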
Using Ant to export a project from source control seems odd to me, but if you want to do this, use a different Ant file :-) You can make a target like ant export -Dproject=foo/bar.
For Ant, I recommend you grab that book - it has a ton of helpful techniques.
The real recommendation I would make though is to drop Ant and convert over to Maven - like the Apache Software Foundation did (they maintain both Ant and Maven).
If you're working with independent projects, you can:
put your build.xml at the top level
place common Ant definitions (Antlib) into another project (e.g. config)
use svn:externals to import the common Antlib definition (from 'config') into your project
EDIT: The trick with svn:externals is that if you link to the HEAD of some common files, it may happen that they will change after a couple of months or years. So each time you tag, you should change the svn:externals to point to a fixed revision of the included project. This can come in handy when a project has to be rebuilt years after it was last built.
My rule of thumb is to put the build.xml file in the directory under which all files are referenced. In other words, no relative paths should start with "../". Where I live, that usually means putting it in the "trunk" directory, which has src, lib, build, docs, etc underneath it.
Doing this makes the paths much cleaner in the file, and it makes it obvious how to build the project.
Where I have multiple projects that need to build, I will create a separate build.xml for each project, and a central build.xml in the directory all the projects are in that calls those other build.xml files. That gives you a lot of flexibility with very little work.
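The central file can be as small as a single subant call; a hedged sketch (the directory layout and target name are assumptions):

<project name="all" default="build-all">
  <target name="build-all">
    <!-- run the "dist" target of each sub-project's own build.xml -->
    <subant target="dist">
      <fileset dir="." includes="*/build.xml"/>
    </subant>
  </target>
</project>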
I'd expect an Ant build file to be located at the top of a project (it's already a pain to have to look at the build file to "discover" how to build the project, so if I have to locate it first, it'll drive me totally crazy). Now, regarding all the drawbacks you mentioned, I'm tempted to say: why don't you use Maven?
The way I have done this in the past (now I just use Maven):
Have a build.xml in the root of each project.
Create an overarching build.xml for all projects and place it in the trunk of my repository.
The overarching build.xml has checkout tasks for each project. I am guessing when you mentioned export from the repository, you actually meant import.
The overarching build file also defines the dependencies, if any.
You may update individual projects using each project's individual build file.
If you do have common tasks defined, you may inherit from a common build file as well, as someone else suggested.
Looks like your set of projects might be a good candidate for migration to Maven. I realize it is not always possible, but if you have time, you might want to look into it.