In Eclipse:
To correct the "Plugin execution not covered by lifecycle configuration..." problem, I chose to create my own lifecycle-mapping-metadata.xml rather than pollute my POM with IDE concerns.
I managed to write this file from examples, but I can't find an XSD or DTD for lifecycle-mapping-metadata.xml. Where is it?
A typical lifecycle-mapping-metadata.xml file in the eclipse/m2e-core repo comes without an XSD reference.
But in that same m2e repo, there is also org.eclipse.m2e.core/mdo/lifecycle-mapping-metadata-model.xml, which serves as the model for that XML file. It is referenced from the main pom.xml, where it is consumed by the modello-maven-plugin.
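For reference, entries in a lifecycle-mapping-metadata.xml follow the structure defined by that model. A minimal sketch (the plugin coordinates below are only an illustration, not something mandated by m2e):

<?xml version="1.0" encoding="UTF-8"?>
<lifecycleMappingMetadata>
  <pluginExecutions>
    <pluginExecution>
      <pluginExecutionFilter>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-antrun-plugin</artifactId>
        <versionRange>[1.0,)</versionRange>
        <goals>
          <goal>run</goal>
        </goals>
      </pluginExecutionFilter>
      <action>
        <ignore />
      </action>
    </pluginExecution>
  </pluginExecutions>
</lifecycleMappingMetadata>

Until an official XSD exists, the model file above is the closest thing to a reference for which elements are allowed.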
I am using Ant files for the build.
The build itself is done by IBM Rational Team Concert (RTC) with the help of this Ant file.
My problem is that if I make a mistake in the build XML itself, such as a mistyped attribute name, it is only detected by RTC after it loads the files from source control (normally 15-20 minutes).
Is there a way to verify (validate) the Ant XML file itself?
There is no schema for an Ant build file. As explained in the Ant FAQ, an incomplete DTD can be created, but it will not really work:
An incomplete DTD can be created by the <antstructure> task - but this one has a few problems:
It doesn't know about required attributes. Only manual tweaking of this file can help here.
It is not complete - if you add new tasks via <taskdef> it won't know about it. See this page by Michel Casabianca for a solution to this problem. Note that the DTD you can download at this page is based on Apache Ant 0.3.1.
It may even be an invalid DTD. As Ant allows task writers to define arbitrary elements, name collisions will happen quite frequently - if your version of Ant contains the optional <test> and <junit> tasks, there are two XML elements named test (the task and the nested child element of <junit>) with different attribute lists. This problem cannot be solved; DTDs don't give a syntax rich enough to support this.
Again, the FAQ states that the DTD is not (yet?) powerful enough to do this, but I found some preliminary work for Ant 1.6, based on Michel Casabianca's work, at the AntDTD page on the Ant Wiki. Personally, I do not intend to use it.
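If you mainly want to catch typos before RTC picks the file up, a rough Ant sketch along these lines can help (the target name is my own invention): <xmlvalidate> with lenient="yes" checks that the file is well-formed XML, though it will not catch an unknown attribute name, and <antstructure> generates the incomplete DTD described in the FAQ for manual tweaking:

<target name="check-build-file">
  <!-- well-formedness check only: catches unclosed tags and bad nesting,
       not misspelled attribute names -->
  <xmlvalidate file="build.xml" lenient="yes"/>
  <!-- generate the (incomplete) DTD mentioned in the FAQ -->
  <antstructure output="project.dtd"/>
</target>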
Being a Maven newbie, I want to know if it's possible to use multiple classifiers at once; in my case, it would be for generating different jars in a single run. I use this command to build my project:
mvn -Dclassifier=bootstrap package
Logically I would think that this is possible:
mvn -Dclassifier=bootstrap,api package
I am using Maven 3.0.4
Your project seems like a candidate for refactoring into a couple of what Maven calls "modules". This involves splitting the code into separate projects within a single directory tree, where the topmost level is normally a parent or aggregator POM with <packaging>pom</packaging> and a <modules/> list containing the sub-project directory names.
Then, I'd advise putting the API interfaces/exceptions/whatnot into an api/ subdirectory with its own pom.xml, and putting the bootstrap classes into a bootstrap/ subdirectory with its own pom.xml. The top-level pom.xml would then list the modules like this:
<modules>
  <module>api</module>
  <module>bootstrap</module>
</modules>
Once you've refactored the project, you will probably want to add a dependency from the bootstrap module to the api module, since I'm guessing the bootstrap will depend on interfaces/etc. from the api.
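As a sketch (the group ID org.example is a placeholder), the bootstrap module's pom.xml would then declare something like:

<dependencies>
  <dependency>
    <!-- bootstrap uses the interfaces/exceptions defined in the api module -->
    <groupId>org.example</groupId>
    <artifactId>api</artifactId>
    <version>${project.version}</version>
  </dependency>
</dependencies>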
Now, you should be able to go into the top level of the directory structure and simply call:
mvn clean install
This approach is good because it forces you to think about how different use cases are supported in your code, and it makes dependency cycles between classes harder to miss.
If you want an example to follow, have a look at one of my github projects: Aprox.
NOTE: If you have many modules dependent on the api module, you might want to list it in the top-level pom.xml in the <dependencyManagement/> section, so you can leave off the version in submodule dependency declarations (see Introduction to the Dependency Mechanism).
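A sketch of what that might look like in the top-level pom.xml (again, org.example is a placeholder):

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.example</groupId>
      <artifactId>api</artifactId>
      <version>${project.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>

With that in place, the submodules can declare the dependency with just the groupId and artifactId and inherit the version from the parent.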
UPDATE: Legacy Considerations
If you can't refactor the codebase for legacy reasons, etc., then you basically have two options:
Construct a series of pom.xml files in an empty multimodule structure, and use the build-helper-maven-plugin along with source includes/excludes to fragment the codebase and allocate the classes to different modules out of a single source tree.
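A very rough sketch of that first option (the shared source location, package name, and plugin versions are assumptions, not part of the original suggestion): each module's pom.xml pulls in the shared source tree with build-helper-maven-plugin and then limits what the compiler picks up:

<build>
  <plugins>
    <plugin>
      <!-- add the shared legacy source tree to this module -->
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>build-helper-maven-plugin</artifactId>
      <!-- pin a plugin version here in a real build -->
      <executions>
        <execution>
          <id>add-legacy-sources</id>
          <phase>generate-sources</phase>
          <goals>
            <goal>add-source</goal>
          </goals>
          <configuration>
            <sources>
              <source>${project.basedir}/../../legacy/src</source>
            </sources>
          </configuration>
        </execution>
      </executions>
    </plugin>
    <plugin>
      <!-- compile only the classes that belong to this module -->
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <configuration>
        <includes>
          <include>com/example/api/**</include>
        </includes>
      </configuration>
    </plugin>
  </plugins>
</build>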
Maybe use a plugin like the assembly plugin to carve up the target/classes directory (${project.build.outputDirectory}) and allocate the classes to different jars. In this scenario, each assembly descriptor requires an <id/>, and by default this value becomes the classifier for that assembly jar. Under this plan, the "main" jar output will still be the monolithic one created by the Maven build. If you don't want this, you can use a separate execution of the assembly plugin and, in its configuration, set <appendAssemblyId>false</appendAssemblyId>. If the output of that assembly is a jar, it will effectively replace the old output from the jar plugin. If you decide to pursue this approach, you might want to read the assembly plugin documentation to get as much exposure to different examples as you can.
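As an illustration of the second option, a hedged sketch of an assembly descriptor (the id and the package path are invented); the <id> here would become the classifier of the resulting jar unless you disable that as described above:

<assembly>
  <id>api</id>
  <formats>
    <format>jar</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <fileSets>
    <fileSet>
      <!-- carve the api classes out of the monolithic output directory -->
      <directory>${project.build.outputDirectory}</directory>
      <outputDirectory>/</outputDirectory>
      <includes>
        <include>com/example/api/**</include>
      </includes>
    </fileSet>
  </fileSets>
</assembly>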
Also, I should note that in both cases you would be stuck with using a set of profiles in the POM to turn different parts of the build on and off and thereby control which artifacts are produced. I'd highly recommend making the default, unqualified build the one that produces everything. This makes it more likely that things like the release plugin will catch everything you want to release, bump versions appropriately, and so on.
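Purely as a sketch of that last point (the profile id is made up), the default, produce-everything build can be marked active by default, so leaner builds have to be requested explicitly with -P:

<profiles>
  <profile>
    <!-- the full build: main jar plus all the extra classified jars -->
    <id>everything</id>
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <!-- the extra assembly/jar executions would be bound here -->
  </profile>
</profiles>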
These solutions are usually what I promote as migration steps when you can't refactor the codebase all at once. They are especially useful when migrating from something like an Ant build that produces multiple jars out of a single source tree.
In a Struts 2 application, there are many Action classes, each with its own action-validation.xml file. How can I organize all these validation XML files into a single folder, such as a "properties" folder, and have the validation rules picked up from there?
A lot of it depends on what build system you are using. If you are using Maven to build, for example, you can place the validation XML files in an identically named package under src/main/resources instead of src/main/java. In Ant, you could probably keep them anywhere and have Ant copy the XML files to the correct place in the final WAR file, though I am not sure how, not really having used Ant much.
I think the key thing is that in the final packaged WAR, the XML files need to sit in the same place as the class files for that action. Where they are before build time doesn't matter so much.
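If you do go the Ant route, a minimal sketch of the copy step might look like this (the directory and property names are assumptions); it just moves the *-validation.xml files next to the compiled Action classes before the WAR is packaged:

<target name="copy-validation" depends="compile">
  <!-- place the validation files alongside the compiled Action classes -->
  <copy todir="${build.classes.dir}">
    <fileset dir="validation" includes="**/*-validation.xml"/>
  </copy>
</target>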
I would like to simplify my main build scripts, and I'd like the ability to reuse certain common Ant tasks as a library, but I'd also like them to be easily available as a package.
For instance, I've got a task which sets up the Flex environment variables that I need to build a variety of projects. I can (and currently do) include those scripts by relative path from another location in source control. But what I want is a single downloadable package, grabbed via Ivy, that contains all of these generic tasks.
A jar seems the most natural solution, since this is doable from Java (use the class loader to access the file inside the jar), but I can't seem to find a "native" way in Ant to just get the XML file.
In short, I want to do:
<import file="some.jar!bootstrap.xml">
But that doesn't work.
Is there some way to do this? Any other suggestions for making a library of Ant scripts would be much appreciated as well.
From what I understand, you're trying to extract a file containing more Ant tasks from your jar and then tell Ant to execute the tasks in those extracted files. Since the files are static, you'd probably be better off creating actual Java Task definitions in your jar and declaring them in your Ant build file. However, if you don't want to do that, you can just use the <unzip> task to extract the resource out of the jar onto the file system and then use the <ant> task to execute the extracted file.
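A rough sketch of that approach (the jar name, directory, and target names are made up):

<target name="run-bootstrap">
  <!-- pull bootstrap.xml out of the shared jar onto the file system -->
  <unzip src="lib/ant-common.jar" dest="${build.dir}/antlib">
    <patternset includes="bootstrap.xml"/>
  </unzip>
  <!-- then execute the extracted build file -->
  <ant antfile="${build.dir}/antlib/bootstrap.xml"/>
</target>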
IIRC there's ongoing work in Ant to support this but it's not supported in any published version.
For those of you that use Ant with multiple projects, where do you put the build.xml files? Do you put one in each project, or do you put them in a separate project that contains all your Ant-related files?
The usual recommendation is to put a build.xml in each project. But this has a few drawbacks:
It makes it hard to reuse common targets in multiple projects.
Sometimes you want to use Ant to export a project from source control and deploy it. Obviously you can't do this if the build file is in the project itself.
But if you put them all in a common location:
People need to be aware of their location to use them; they can't just use "ant -find" to find the current project's file.
You can't have different build instructions for different branches of the project.
What do you guys do?
EDIT: Thanks for the good suggestions so far. As for Maven, these aren't Java projects, and I get the impression that Maven is only meant for Java.
Place the Ant files with the project. That is the de facto standard and recommended by the creator of Ant. I will try to address some of the issues you have brought up:
Reuse of common targets should be done using the techniques described by Erik Hatcher in his book Java Development with Ant. Basically, you extract all the common pieces into some top-level files that all other Ant files "inherit" from.
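A minimal sketch of that "inheritance" pattern (the file and target names are made up): the shared file defines the common targets, and each project's build.xml simply imports it:

<project name="my-project" default="dist">
  <!-- pull in shared targets (compile, test, dist) from the common file -->
  <import file="../build-common/common-targets.xml"/>
  <!-- project-specific targets defined here can override the imported ones -->
</project>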
Using Ant to export a project from source control seems odd to me, but if you want to do this, use a different Ant file :-) You can create a target for it and invoke it like ant export -Dproject=foo/bar.
For Ant, I recommend you grab that book - it has a ton of helpful techniques.
The real recommendation I would make though is to drop Ant and convert over to Maven - like the Apache Software Foundation did (they maintain both Ant and Maven).
If you're working with independent projects, you can:
put your build.xml at the top level
place common Ant definitions (Antlib) into another project (e.g. config)
use svn:externals to import the common Antlib definition (from 'config') into your project
EDIT: The trick with svn:externals is that if you link to the HEAD of some common files, they may well change after a couple of months or years. So each time you tag, you should change the svn:externals to point to a fixed version of the included project. This comes in handy when a project has to be rebuilt years after it was last built.
My rule of thumb is to put the build.xml file in the directory under which all files are referenced. In other words, no relative paths should start with "../". Where I live, that usually means putting it in the "trunk" directory, which has src, lib, build, docs, etc underneath it.
Doing this makes the paths much cleaner in the file, and it makes it obvious how to build the project.
Where I have multiple projects that need to be built, I create a separate build.xml for each project, plus a central build.xml in the directory all the projects are in that calls those other build.xml files. That gives you a lot of flexibility with very little work.
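A sketch of that central file (the target names are assumptions), using <subant> to run each project's own build.xml:

<project name="master" default="build-all" basedir=".">
  <target name="build-all">
    <!-- run the "build" target of every project's build.xml, one level down -->
    <subant target="build">
      <fileset dir="." includes="*/build.xml"/>
    </subant>
  </target>
</project>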
I'd expect an Ant build file to be located at the top of a project (it's already a pain to have to look at the build file to "discover" how to build the project, so if I have to locate it first, it'll drive me totally crazy). Now, regarding all the drawbacks you mentioned, I'm tempted to say: why don't you use Maven?
The way I have done this in the past (now I just use Maven):
Have a build.xml in the root of each project
Create an overarching build.xml for all projects and place it in the trunk of my repository
The overarching build.xml has checkout tasks for each project
I am guessing that when you mentioned export from the repository, you actually meant import
The overarching build file also defines the dependencies, if any
You may update individual projects using each project's individual build file
If you do have common tasks defined, you may also inherit from a common build file, as someone else suggested
Looks like your set of projects might be a good candidate for migration to Maven. I realize that is not always possible, but if you have time, you might want to look into it.