I am trying to exclude code in JaCoCo. I have added <exclude> tags for the packages that I want to exclude and, conversely, <include> tags for the packages that I want to include.
But EclEmma still reads the whole codebase during JUnit coverage, i.e. all packages, and so my code coverage goes down.
Here is my code:
<includes>
<include>com.cfgh.controller.*</include>
<include>com.cfgh.service.*</include>
<include>com.cfgh.repository.*</include>
</includes>
<excludes>
<exclude>com.cfgh.config.*</exclude>
<exclude>com.cfgh.model.dto.*</exclude>
<exclude>com.cfgh.model.entity.*</exclude>
<exclude>com.cfgh.repository.*</exclude>
<exclude>com.cfgh.exception.handler.*</exclude>
</excludes>
The include and exclude tags appear in green in the pom.xml file, but they are still not being picked up.
Can someone please point me in the right direction? Thanks in advance.
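(For context, with the jacoco-maven-plugin such include/exclude tags normally sit under the plugin's <configuration> element in the pom. The sketch below is only illustrative, with a placeholder version and patterns copied from the question; it is not taken from the original pom.)
<plugin>
<groupId>org.jacoco</groupId>
<artifactId>jacoco-maven-plugin</artifactId>
<!-- placeholder version, not from the question -->
<version>0.8.8</version>
<configuration>
<includes>
<!-- remaining include patterns from the question go here -->
<include>com.cfgh.controller.*</include>
</includes>
<excludes>
<!-- remaining exclude patterns from the question go here -->
<exclude>com.cfgh.config.*</exclude>
</excludes>
</configuration>
<executions>
<execution>
<goals>
<goal>prepare-agent</goal>
</goals>
</execution>
<execution>
<id>report</id>
<phase>test</phase>
<goals>
<goal>report</goal>
</goals>
</execution>
</executions>
</plugin>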
Let's say I need to access a groupId inside the properties tag of my POM file. How can I access that in my Mule configuration file?
I tried to access it like ${pom.properties.app.groupId}, but it didn't work. Any ideas?
This question has been asked before, at "Mule 4: Is there a way to refer Maven POM properties from a Mule Flow?" and probably others.
I'll repeat my answer here because Stack Overflow doesn't allow me to mark this one as a duplicate.
That's because Maven properties no longer exist when the application is executed.
You can use some Maven plugins to replace values in a properties file that can then be used inside the application, for example the Maven Resources Plugin. Be careful not to override properties inside other files, such as the application's XML Mule configurations.
Update: I have created an example:
Assume that you have a properties file called config.properties in which you want to put the value of the Maven property some.property:
a=1
b=${some.property}
Then you can enable Maven property filtering in just that file with:
<build>
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>false</filtering>
</resource>
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
<includes>
<include>config.properties</include>
</includes>
</resource>
</resources>
...
My blog has a longer explanation of this example: https://medium.com/#adobni/using-maven-properties-as-mule-properties-3f1d0db62c43
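As a rough sketch of the consuming side, assuming a Mule 4 application that loads the filtered file via configuration-properties (the set-payload usage is illustrative, not taken from the question):
<!-- load the filtered file so the app sees the value Maven injected -->
<configuration-properties file="config.properties"/>
<!-- the value of 'b' (i.e. the Maven some.property) can then be referenced as a placeholder -->
<set-payload value="${b}"/>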
I am on the verge of pulling all my hair out; someone please help me...
I am using JMeter 3.0 and am trying to generate the dashboard report from my jtl files, but I get the error -
'result.jtl' does not contain the field names header, ensure the jmeter.save.saveservice.* properties are the same as when the CSV file was created or the file may be read incorrectly
My user.properties file contains:
jmeter.save.saveservice.output_format=csv
jmeter.save.saveservice.bytes=true
jmeter.save.saveservice.label=true
jmeter.save.saveservice.latency=true
jmeter.save.saveservice.response_code=true
jmeter.save.saveservice.response_message=true
jmeter.save.saveservice.successful=true
jmeter.save.saveservice.thread_counts=true
jmeter.save.saveservice.thread_name=true
jmeter.save.saveservice.time=true
jmeter.save.saveservice.timestamp_format=ms
jmeter.save.saveservice.timestamp_format=yyyy-MM-dd HH:mm:ss
jmeter.save.saveservice.print_field_names=true
These values are the same in the jmeter.properties file as well, just to ensure I haven't missed anything.
I really can't work out why I can't get the JTL to include the headers; I have followed every guide I can find, and I seem to be doing it right.
Can someone point out what I am missing, or include a zipped version of their JMeter setup with it all working that I can point my Ant project to?
Hope someone can help.
Double-check the <jmeter> section of your build.xml file. The default JMeter Ant Task assumes XML output format for .jtl result files, so if you have the following line:
<property name="jmeter.save.saveservice.output_format" value="xml"/>
just comment it out or delete it and your issue should be resolved.
I don't think the JMeter Ant Task respects overrides via the user.properties file; it is better to use the jmeterproperties attribute or to explicitly specify the relevant configuration in the Ant build file, like:
<target name="test">
<jmeter
jmeterhome="${jmeter.home}"
testplan ="${testpath}/${test}.jmx"
resultlog="${testpath}/${test}.jtl">
<property name="jmeter.save.saveservice.output_format" value="csv"/>
<property name="jmeter.save.saveservice.print_field_names" value="true"/>
<property name="jmeter.save.saveservice.timestamp_format" value="ms"/>
<!--etc.-->
</jmeter>
</target>
I would also recommend keeping only one of the jmeter.save.saveservice.timestamp_format properties (either ms or yyyy-MM-dd HH:mm:ss), as the duplicate might cause problems with the dashboard generation; having duplicate property names with different values is not a very good practice.
See the Five Ways To Launch a JMeter Test without Using the JMeter GUI article for more information on running JMeter tests via the Ant task and other ways of kicking off a JMeter test.
I had noticed this just before you posted, but you are correct: the XML type was hardcoded in the build.xml. Now that I have changed that, all is working :)
I have a Java project that I analyze using Sonar. The Java packages are all under a source folder, and I also have some test files under a different folder. In Sonar, I organize my projects under a different structure, i.e. for a project "search" I only want to include the "search" package. This exclusion is quite easy to accomplish using the sonar.exclusions property. My question, though, is what about the tests? How can I exclude some of those packages? From my testing, even though my source and test folders use the same structure, the test packages are not automatically excluded when I specify "sonar.exclusions".
my folder structure:
/src/com/domain/
-- search/
-- utils/
-- pooling/
-- category/
/test/src/com/domain/
-- utils/
-- pooling/
Sonar properties:
<property name="sonar.sources" value="${path}/src" />
<property name="sonar.tests" value="${path}/test/src" />
<property name="sonar.exclusions" value="com/domain/utils/**/*,com/domain/pooling/**/*,com/domain/category/**/*" />
So, I am trying to include only the "search" package. The configuration above works in the sense that it causes Sonar to analyze only my "search" package, which can be seen in the Sonar "Components" tab. Unfortunately, in addition to the "search" component, I can also see the "utils" and "pooling" components. I have done some testing and am certain that these two components (utils and pooling) are the result of the "sonar.tests" property. Just a note, though: even though "utils" and "pooling" show up under Components, Sonar shows zero files under both of them. So, going back to my question: is there anything I can do to exclude "utils" and "pooling" from showing up under "Components"? Maybe using a property (i.e. sonar test exclusions)?
By the way, I am using Sonar 2.11 running under Red Hat Linux, with SONAR-TASK 1.2.
Any help is welcomed and appreciated! Thanks!
You can define exclusions in the configuration for the project directly in Sonar.
From the documentation:
Since version 3.3, it is also possible to:
Exclude test files from being analyzed: go to Configuration > Settings > Exclusions and set the sonar.test.exclusions property
The trick is:
sonar.exclusions: excludes files from the sources directory (i.e. sonar.sources); it has no effect on the tests directory.
sonar.test.exclusions: excludes files from the tests directory (i.e. sonar.tests); it has no effect on the sources directory.
See https://docs.sonarqube.org/display/SONAR/Narrowing+the+Focus
And, Using sonar.test.exclusions with Sonarqube 6.3
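Translated to the Ant-style properties from the question, a hedged sketch (this assumes a Sonar version that supports sonar.test.exclusions, i.e. 3.3+ per the quoted documentation):
<property name="sonar.sources" value="${path}/src" />
<property name="sonar.tests" value="${path}/test/src" />
<!-- keep the non-search packages out of the source analysis -->
<property name="sonar.exclusions" value="com/domain/utils/**/*,com/domain/pooling/**/*,com/domain/category/**/*" />
<!-- and keep them out of the test analysis as well -->
<property name="sonar.test.exclusions" value="com/domain/utils/**/*,com/domain/pooling/**/*" />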
I'm trying to build my Flex 4 project using Ant. In Flash Builder 4, in the project properties it's possible to set the "Framework linkage" to one of "Merged into code", "Runtime Shared Library (RSL)" or "Use SDK Default (Runtime Shared Library)". How can I set the equivalent as mxmlc options in build.xml?
My current build.xml looks like this:
<target name="myapp">
<mxmlc
file="${PROJECT_ROOT}/myapp.mxml"
output="${DEPLOY_DIR}/myapp.swf"
actionscript-file-encoding="UTF-8"
keep-generated-actionscript="false"
warnings="false" optimize="true" incremental="false" >
<load-config filename="${FLEX_HOME}/frameworks/flex-config.xml"/>
<source-path path-element="${FLEX_FRAMEWORKS}"/>
<compiler.debug>true</compiler.debug>
<runtime-shared-library-path path-element="${FLEX_FRAMEWORKS}/libs/framework.swc">
<url rsl-url="framework_4.0.0.14159.swz"/>
<url rsl-url="framework_4.0.0.14159.swf"/>
</runtime-shared-library-path>
<compiler.source-path path-element="src"/>
<!-- List of external libraries -->
<compiler.source-path path-element="${MY_LIB}/src" />
<!-- List of SWC files or directories that contain SWC files. -->
<compiler.library-path dir="libs" append="true">
<include name="*.swc" />
</compiler.library-path>
</mxmlc>
<copy todir="${DEPLOY_DIR}" file="${FLEX_FRAMEWORKS}/rsls/framework_4.0.0.14159.swz"/>
<copy todir="${DEPLOY_DIR}" file="${FLEX_FRAMEWORKS}/rsls/framework_4.0.0.14159.swf"/>
</target>
I assumed that setting the runtime-shared-library-path directive and copying the framework SWF and SWZ files into my target folder would make things work, but this does not seem to be the case.
The way I'm assessing whether this works is as follows: I use a custom preloader, and for it to work I need framework linkage set to RSL. With "merged into code", my preloader gets stuck at a certain point and does not progress to my application SWF. This is the same behavior I see when I use the above build.xml, which makes me think that the SWF is being built with framework linkage merged into code (rather than RSL-linked).
A related question is how to determine whether my SWF is using RSLs or not. I could look at the size of the compiled output, but it seems there should be a way to tell whether I'm using the external framework file or it's being bundled into the SWF without my knowledge.
This is a little tricky because the documentation is somewhat scarce on this. You probably need to set the following option, either on the command line or in a config file.
static-link-runtime-shared-libraries=false
The documentation from Adobe gives the following slightly cryptic description of what this option does.
Determines whether to compile against libraries statically or use RSLs. Set this option to true to ignore the RSLs specified by the runtime-shared-library-path option. Set this option to false to use the RSLs. The default value is true.
This option is useful so that you can quickly switch between a statically and dynamically linked application without having to change the runtime-shared-library-path option, which can be verbose, or edit the configuration files.
Here is a link to the documentation.
"About the application compiler options"
Note that, according to the documentation, the default value is true. HOWEVER, if you are loading a flex-config.xml file (default or custom), you should also check whether this setting is present in that file and what its value is. In my experience the value in frameworks/flex-config.xml is actually false. It appears, however, that in the example above it may be set the other way.
(We use a different build system than Ant, so I am not that familiar with the build.xml syntax you would need.)
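As a hedged sketch only: if the <mxmlc> Ant task accepts this option as a nested element, like the other compiler settings already present in the question's build file, it would presumably look something like this (attributes trimmed to the ones from the question):
<mxmlc file="${PROJECT_ROOT}/myapp.mxml" output="${DEPLOY_DIR}/myapp.swf">
<!-- explicitly ask for dynamic (RSL) linkage instead of static linkage -->
<static-link-runtime-shared-libraries>false</static-link-runtime-shared-libraries>
<!-- existing runtime-shared-library-path entries stay as they are -->
</mxmlc>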
I'm trying to use xmltask for Ant to modify a file in a subdirectory:
project/path/to/file.xml
The file refers to a DTD like this:
<!DOCTYPE data SYSTEM "mydtd.dtd">
I don't have the flexibility to change these documents.
This DTD is stored in the same subdirectory, which has always worked fine:
project/path/to/mydtd.dtd
Unfortunately, xmltask is trying to locate the dtd in my project's top-level directory, which is where my build file is located, and where I run from:
[xmltask] java.io.FileNotFoundException: /home/me/project/mydtd.dtd (The system cannot find the file specified)
I see in the xmltask documentation that I can correct this with an xmlcatalog element to tell it where to look up the file. But I need to use a dtd element, and I can only find examples for this element, not documentation; the examples show only a publicId, and if I understand XML correctly this document does not have one. I shouldn't need to specify this, anyway, right, since my document already says my DTD is stored locally and shows right where it is?
Why isn't xmltask finding the DTD correctly? What's the best way to correct or work around this situation?
An XML Catalog is the way to go here; it just needs a bit more perseverance.
As you correctly pointed out, the standard Ant <XmlCatalog> type only allows you to specify public DTD references when using the inline syntax, which is of no use to you. However, <XmlCatalog> also lets you specify a standard OASIS-syntax catalog, which is far richer, including resolving SYSTEM DTD references.
An OASIS catalog (full spec here) looks like this:
<catalog xmlns="urn:oasis:names:tc:entity:xmlns:xml:catalog">
<system systemId="mydtd.dtd" uri="project/path/to/mydtd.dtd"/>
</catalog>
You can then wrap this catalog file in a reusable <XmlCatalog> definition:
<xmlcatalog id="commonDTDs">
<catalogpath>
<pathelement location="path/to/oasis.catalog"/>
</catalogpath>
</xmlcatalog>
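That definition can then be pulled in by id from the task that needs it; as a sketch, using the paths from the question (exact xmltask attributes may differ):
<xmltask source="project/path/to/file.xml" dest="project/path/to/file.xml">
<xmlcatalog refid="commonDTDs"/>
<!-- xmltask edits go here -->
</xmltask>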
And that's that. It's a good idea to build up a reusable OASIS catalog file, and refer to it from various XML-related Ant tasks, all of which can use <XmlCatalog>.
As an alternative, it looks like I can skip the whole validation by creating a blank file with the same name as the DTD file, and then deleting the file when I am done. Odds are I am going to go that route instead of using the catalog.
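A rough sketch of that workaround in Ant terms (the placeholder name matches the DTD from the question; the xmltask attributes are illustrative):
<!-- drop an empty placeholder DTD next to the build file so resolution succeeds -->
<touch file="mydtd.dtd"/>
<xmltask source="project/path/to/file.xml" dest="project/path/to/file.xml">
<!-- edits go here -->
</xmltask>
<!-- remove the placeholder again -->
<delete file="mydtd.dtd"/>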
xmltask isn't finding it because it is looking in the current working directory. Ant allows you to specify a base directory using the basedir attribute of the <target> element. So I suggest you try this:
<target basedir="path/to" ...>
<xmltask...
</target>
It strikes me that it is not the XML/DTD that you really have the problem with, but getting xmltask to interact with the two of them as you want.
If that fails, you could use the Ant Copy task to copy the XML and DTD to the root folder before processing with xmltask, then copying back again.
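For example, a sketch of that copy-based approach (paths taken from the question; purely illustrative):
<!-- bring the XML and its DTD next to the build file -->
<copy file="project/path/to/file.xml" todir="${basedir}"/>
<copy file="project/path/to/mydtd.dtd" todir="${basedir}"/>
<xmltask source="file.xml" dest="file.xml">
<!-- edits go here -->
</xmltask>
<!-- put the edited XML back where it belongs -->
<copy file="file.xml" tofile="project/path/to/file.xml" overwrite="true"/>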
Have you tried:
<!DOCTYPE data SYSTEM "./path/to/mydtd.dtd">
? Or an absolute path?
Also, you can find the <dtd> description here.
I had a similar problem where an XML file had a doctype with SYSTEM reference that could not be changed.
<!DOCTYPE opencms SYSTEM "http://www.opencms.org/dtd/6.0/opencms-modules.dtd">
I first went down the road and created a catalog file with the OASIS catalog as described above, but to be able to use external catalogs I had to include the Apache Commons Resolver 1.1 (resolver.jar) in the Ant classpath (see http://ant.apache.org/manual/Types/xmlcatalog.html).
Because I had multiple machines on which this build was supposed to run, this seemed like overkill, especially since xmltask worked fine if I just removed the doctype definition. I wasn't allowed to remove it permanently because the doctype was needed elsewhere.
Ultimately I used this workaround: I commented out the doctype definition using Ant's replace task, ran the xmltask, and then put the doctype back into the file.
<replace file="myxmlfile.xml">
<replacetoken><![CDATA[<!DOCTYPE opencms SYSTEM "http://www.opencms.org/dtd/6.0/opencms-modules.dtd">]]></replacetoken>
<replacevalue><![CDATA[<!-- !DOCTYPE opencms SYSTEM "http://www.opencms.org/dtd/6.0/opencms-modules.dtd" -->]]></replacevalue>
</replace>
<xmltask .../>
<replace file="${local.opencms.webapp.webinf}/config/opencms-modules.xml">
<replacetoken><![CDATA[<!-- !DOCTYPE opencms SYSTEM "http://www.opencms.org/dtd/6.0/opencms-modules.dtd" -->]]></replacetoken>
<replacevalue><![CDATA[<!DOCTYPE opencms SYSTEM "http://www.opencms.org/dtd/6.0/opencms-modules.dtd">]]></replacevalue>
</replace>