How can I access POM file properties inside mule configuration files? - pom.xml

Let's say I needed to access a groupId inside the properties tag of my POM file. How can I access that in my Mule configuration file?
I tried to access it like ${pom.properties.app.groupId} but it didn't work. Any ideas?

This question has been asked before at Mule 4 : Is there a way to refer Maven POM properties from a Mule Flow? and probably others.
I'll repeat my answer here because Stack Overflow doesn't allow me to mark this one as a duplicate.
That's because Maven properties do not exist anymore when the
application is executed.
You can use some Maven plugins to replace values in a properties
file that can be used inside the application, for example the
Maven Resources plugin. Be careful not to override properties inside other
files, like the XML Mule configurations of the application.
Update: I have created an example:
Assume that you have a properties file called config.properties in which you want to put the value of the Maven property some.property:
a=1
b=${some.property}
Then you can enable Maven property filtering in just that file with:
<build>
  <resources>
    <resource>
      <directory>src/main/resources</directory>
      <filtering>false</filtering>
    </resource>
    <resource>
      <directory>src/main/resources</directory>
      <filtering>true</filtering>
      <includes>
        <include>config.properties</include>
      </includes>
    </resource>
  </resources>
  ...
</build>
My blog has a longer explanation of this example: https://medium.com/@adobni/using-maven-properties-as-mule-properties-3f1d0db62c43

Related

How to exclude code (packages) in JaCoCo, EclEmma

I am trying to exclude code in JaCoCo. I have added <exclude> tags in my POM for the packages that I want to exclude and, vice versa, <include> tags for the packages that I want to include.
But EclEmma's JUnit coverage is still reading the whole code, i.e. all packages, and hence my code coverage goes down.
Here is my code:
<includes>
<include>com.cfgh.controller.*</include>
<include>com.cfgh.service.*</include>
<include>com.cfgh.repository.*</include>
</includes>
<excludes>
<exclude>com.cfgh.config.*</exclude>
<exclude>com.cfgh.model.dto.*</exclude>
<exclude>com.cfgh.model.entity.*</exclude>
<exclude>com.cfgh.repository.*</exclude>
<exclude>com.cfgh.exception.handler.*</exclude>
</excludes>
The <include> and <exclude> tags appear in green in the pom.xml file, but it is still not reading them.
Can someone please guide me in the right direction? Thanks in advance.

Configure spring security ldap-server attribute to use different url based on deployed environment

We are using Spring Security and have it working well. I am trying to figure out one thing that has not been obvious: how do I configure the ldap-server url attribute to use a different url based on the deployed environment?
This is what I have that is working:
<ldap-server url="ldap://testserver:port/o=blah" manager-dn="cn=bind,ou=Users,o=blah" manager-password="password"/>
<authentication-manager id="authenticationManager" alias="authenticationManager">
<ldap-authentication-provider
user-search-filter="(cn={0})"
user-search-base="ou=Users"
group-search-filter="(uniqueMember={0})"
group-search-base="ou=groups"
group-role-attribute="cn"
role-prefix="none">
</ldap-authentication-provider>
</authentication-manager>
Now, how do I configure it to use a different url based on deployed environment?
thanks in advance,
Sharath
I've done that with Spring profiles:
In your spring.*.xml config file use this at the end of your file:
<beans profile="production">
...
</beans>
<beans profile="local">
...
</beans>
The active profile must be provided as a VM argument:
-Dspring.profiles.active=production
Regards
You can use variables for the url and set them in a properties file.
Changing the properties file should be easier. I know you can do that with Maven (with the jar or war plugin, depending on the packaging, including generating two or more packages with one execution), but I suppose you can with Ant or other build tools too.
Of course, you could use that solution to change the whole XML, but it's easier to do it with a properties file because that way, when changing the configuration, the markup will not be in the way, only variables and values.
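As a sketch of that approach (the property names and file name here are hypothetical), the url and credentials can be externalized behind placeholders:

```xml
<!-- ldap.properties is a hypothetical, environment-specific file -->
<context:property-placeholder location="classpath:ldap.properties"/>

<!-- the values are now resolved from the properties file instead of being hard-coded -->
<ldap-server url="${ldap.url}"
             manager-dn="${ldap.manager.dn}"
             manager-password="${ldap.manager.password}"/>
```

A different ldap.properties can then be packaged per environment at build time.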

Published artifact patterns in Ivy

When I resolve artifacts from my repository (e.g. filesystem), I use two artifact patterns:
<artifact pattern="${location}/[organisation]/[module]/[revision]/[artifact]-[revision].[ext]"/>
<artifact pattern="${location}/[organisation]/[module]/[revision]/[artifact]-[revision]-[type]s.[ext]"/>
The first one is for jar files, and the second one is for sources or other types of artifacts.
I'd like to be able to publish artifacts the same way, but I don't know how.
Using just the patterns above, the publish task seems to consider only the first one, thus removing the type. If multiple artifacts have the same name and extension, they will be overwritten.
If I just use the second pattern, then for jar artifacts it produces [artifact]-[revision]-jars.jar (for example foo-1.0-jars.jar), which is really ugly.
Finally, it seems to be possible to have optional parts in patterns, such as:
<artifact pattern="${location}/[organisation]/[module]/[revision]/[artifact]-[revision](-[type]s).[ext]"/>
But the -[type]s part is omitted only if the type is null or empty, and I'd like the type to remain "jar", in which case the part is not omitted.
So is there any other way?
Why don't you use ivy.xml files for your artifacts? You need to create an ivy.xml file and place it in your module folder, next to the jar files. An ivy.xml example:
<ivy-module version="1.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="http://ant.apache.org/ivy/schemas/ivy.xsd">
<info organisation="com.organisation" module="foo" revision="1.0"/>
<publications>
<artifact name="foo"/>
<artifact name="foo-sources" type="source" ext="zip"/>
</publications>
</ivy-module>
Then you should define ivy pattern in your resolver:
<ivy pattern="${location}/[organisation]/[module]/[revision]/ivy.xml"/>
Now if you use <dependency org="com.organisation" name="foo" rev="1.0"/> you will get all the artifacts described in ivy.xml. There is also a way to select only the artifacts you need.
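For instance, a nested artifact element inside the dependency restricts retrieval to the listed artifacts (a sketch; Ivy configurations are another way to achieve this):

```xml
<!-- only the main jar is retrieved; the sources artifact is skipped -->
<dependency org="com.organisation" name="foo" rev="1.0">
  <artifact name="foo" type="jar"/>
</dependency>
```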
Not really a solution, but a slightly better way is:
<artifact pattern="${location}/[organisation]/[module]/[revision]/[type]s/[artifact]-[revision].[ext]"/>
I struggled with the same.
I found the solution, you can use:
[artifact](-[classifier]).[ext]
[classifier] will be null/empty for the main jar; for sources/javadoc jars it contains -sources/-javadoc, e.g. foo-1.0-sources.jar.
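Applied to the pattern from the question, that could look like this (a sketch, assuming the repository stores sources under a sources classifier):

```xml
<!-- (-[classifier]) is dropped for the main jar (foo-1.0.jar) and kept for foo-1.0-sources.jar -->
<artifact pattern="${location}/[organisation]/[module]/[revision]/[artifact]-[revision](-[classifier]).[ext]"/>
```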
I know it's been a while, but I found this question through Google, so I think it will be helpful for anyone in the future.

spring.net used in a class library

I am attempting to use spring.net's IoC container in a class library which, in and of itself, is not an executable. A web project simply calls this library; this library contains the references to the Spring binaries and to Spring's config files.
Essentially the question is:
Does spring.net need to reside in an executable to start, or can it reside in a class library that will be referenced by an executable?
Any help will be appreciated.
You can include part of your configuration in the class library project as an embedded resource file. Let's say you called it LibraryConfig.xml. Then in your executable's application config file, you include the embedded resource using the assembly: prefix. Here's an example:
<spring>
<context type="Spring.Context.Support.XmlApplicationContext, Spring.Core">
<resource uri="assembly://FooLibrary/FooLibrary/LibraryConfig.xml"/>
<resource uri="config://spring/objects" />
</context>
<objects xmlns="http://www.springframework.net">
<object id="mainForm" type="FooApp.MainForm, FooApp">
<!-- mainController is some object defined in LibraryConfig.xml -->
<property name="Controller" ref="mainController"/>
</object>
</objects>
</spring>
If your main application doesn't need to use Spring itself, I think you can set up the whole application context in the library. Embed the config file as described above, then define a singleton object to hold the application context and load it from the embedded config file. Finally, you need to define some kind of factory methods for the client code to create your classes with. The factory methods can either go on the singleton itself (probably using generics), or have a separate factory method on each class that needs to be instantiated. Those factory methods make the actual requests from the application context and the client code never sees it.
It can reside in a DLL which is referenced by an executable, but make sure that the configuration is included in (or referenced by) the executable's config file.

Avoiding re-building prerequisites in Ant

I have an existing Ant project and would like to speed up the build process
by avoiding re-building components that are already up to date.
Ant permits you to specify that one target depends on another, but by
default every prerequisite is always rebuilt, even if it is already up to
date. (This is a key difference between Ant and make. By default, make
only re-builds a target when needed -- that is, if some prerequisite is
newer.)
To make Ant re-build prerequisites only if necessary, there seem to be two
general approaches within Ant.
The first approach is to use the uptodate task to set a property. Then,
your task can test the property and build only if the property is (not)
set.
<uptodate property="mytarget.uptodate"> <!-- in the set.mytarget.uptodate target -->
...
</uptodate>
<!-- The prerequisites are executed before the "unless" is checked. -->
<target name="mytarget" depends="set.mytarget.uptodate" unless="mytarget.uptodate">
...
</target>
An alternate first approach is to use the outofdate task from ant-contrib.
It's nicer in that it is just one target, without a separate property being
defined; by contrast, uptodate requires separate targets to set and to
test the property.
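A sketch of the outofdate approach (directory, file, and target names are hypothetical, and the ant-contrib tasks are assumed to be loaded via taskdef):

```xml
<outofdate>
  <sourcefiles>
    <fileset dir="src" includes="**/*.java"/>
  </sourcefiles>
  <targetfiles path="dist/app.jar"/>
  <sequential>
    <!-- runs only when dist/app.jar is missing or older than a source file -->
    <antcall target="build-jar"/>
  </sequential>
</outofdate>
```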
The second approach is to create a <fileset> using the <modified>
selector. It calculates MD5 hashes for files and selects files whose MD5
differs from earlier stored values. It's optional to set
<param name="cache.cachefile" value="cache.properties"/>
inside the selector; it defaults to "cache.properties". Here is
an example that copies all files from src to dest whose content has
changed:
<copy todir="dest">
<fileset dir="src">
<modified/>
</fileset>
</copy>
Neither of these is very satisfactory, since it requires me to write Ant
code for a process (avoiding re-building) that ought to be automatic.
There is also Ivy, but I can't tell from its documentation whether it
provides this feature. The key use case in the Ivy documentation seems to
be downloading subprojects from the Internet rather than avoiding wasted
work by staging the parts of a single project. Maven provides similar
functionality, with the same use case highlighted in its documentation.
(Moving an existing non-trivial project to Maven is said to be a nightmare;
by contrast, starting greenfield development with Maven is more palatable.)
Is there a better way?
This conditional rebuilding in a large build is a feature of make that I initially missed in Ant. Rather than use target dependencies, I'd suggest dividing your large project into smaller modules, each publishing to a common shared repository.
Ivy can then be used to control the component versions used by the main module of the project.
<ivy-module version="2.0">
<info organisation="com.myspotontheweb" module="multi_module_project"/>
<publications>
<artifact name="main" type="jar"/>
</publications>
<dependencies>
<dependency org="com.myspotontheweb" name="component1" rev="latest.integration"/>
<dependency org="com.myspotontheweb" name="component2" rev="latest.integration"/>
<dependency org="com.myspotontheweb" name="component3" rev="latest.integration"/>
<dependency org="com.myspotontheweb" name="component4" rev="latest.integration"/>
</dependencies>
</ivy-module>
The ivy:retrieve task will only download/copy a sub-module if it has changed (i.e. been re-published from its build file).
It all sounds more complicated, but maybe you're already sub-dividing the project within your build file, for example if your Ant uptodate task is made dependent on one of the build artifacts.
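The retrieve step in the main module's build could then be as simple as this (the destination pattern is an assumption about the local layout):

```xml
<!-- copies each resolved dependency into lib/, skipping artifacts already up to date there -->
<ivy:retrieve pattern="lib/[artifact]-[revision].[ext]"/>
```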
