Creating a new Allure adapter for our own framework - ant

I'm trying to create an Allure adapter for our own framework. Our framework uses its own assert mechanism, so I need to write the adapter myself.
The adapter class is very simple and looks like this:
import java.util.Collections;
import java.util.Set;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

import ru.yandex.qatools.allure.Allure;
import ru.yandex.qatools.allure.events.TestSuiteFinishedEvent;
import ru.yandex.qatools.allure.events.TestSuiteStartedEvent;

public class AllureReportListener {

    private static AllureReportListener object = new AllureReportListener();

    private Allure lifecycle = Allure.LIFECYCLE;

    private String suiteUid = UUID.randomUUID().toString();

    private Set<String> startedTestNames = Collections.newSetFromMap(
            new ConcurrentHashMap<String, Boolean>());

    public static AllureReportListener getReportListener() {
        return object;
    }

    public void onTestSuiteStart(String testCaseName) {
        getLifecycle().fire(new TestSuiteStartedEvent(suiteUid, testCaseName));
    }

    public void onTestSuiteFinish() {
        getLifecycle().fire(new TestSuiteFinishedEvent(suiteUid));
    }

    Allure getLifecycle() {
        return lifecycle;
    }
}
Our own test suite class calls these methods at the right points in the run.
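For illustration, the suite class drives the listener roughly like this (the call sites below are a hypothetical sketch, not our real framework code):
AllureReportListener listener = AllureReportListener.getReportListener();
// fired by our suite runner when a suite begins
listener.onTestSuiteStart("SEEDLoginCase");
// ... the scenarios of the suite run here ...
// fired by our suite runner when the suite ends
listener.onTestSuiteFinish();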
Since we have our own testing framework, we also have our own ant task, called ownrunner, which is used like this:
<target name="test">
    <ownrunner classpathref="classpath" file="config/usecase/SEEDLoginCase.xml" parallel="Scenario" output="${build.report}">
    </ownrunner>
</target>
I ran the ant build, but I didn't see any Allure results in the build folder.
Now I'm stuck. I want this ant task to generate the Allure XML results. What do I need to do?

You need to run the tests with the AspectJ weaver agent too, as part of the ant build. Add this line to your build:
<jvmarg value="-javaagent:${currentDir}/aspectjweaver-1.8.0.jar"/>
This will get you the Allure results under the directory you have configured.
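For example, assuming the ownrunner task forks a JVM and accepts nested <jvmarg> elements the way the built-in junit task does, the target would look something like this:
<target name="test">
    <ownrunner classpathref="classpath" file="config/usecase/SEEDLoginCase.xml" parallel="Scenario" output="${build.report}">
        <!-- load the AspectJ weaver agent so the Allure adapter events are processed -->
        <jvmarg value="-javaagent:${currentDir}/aspectjweaver-1.8.0.jar"/>
    </ownrunner>
</target>
The results directory itself is typically taken from an allure.properties file on the test classpath (the allure.results.directory property), so make sure it points where you expect.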

Related

Where is the `cc_proto_library` implementation in Bazel?

I read the source code of Bazel, and I found cc_binary and cc_library in src/main/starlark/builtins_bzl/common/cc.
I also found proto_library in src/main/starlark/builtins_bzl/common/proto. But I can't find where cc_proto_library's implementation is.
Can anyone tell me how it works?
Tip: use the Bazel codesearch at https://source.bazel.build to navigate its source code.
You can quickly find the implementation of any built-in rule using a query like:
language:java "implements RuleDefinition" "\"cc_proto_library\")"
This will bring you to the CcProtoLibraryRule.java definition:
public class CcProtoLibraryRule implements RuleDefinition {
  private final CcProtoAspect ccProtoAspect;

  public CcProtoLibraryRule(CcProtoAspect ccProtoAspect) {
    this.ccProtoAspect = ccProtoAspect;
  }

  @Override
  public RuleClass build(RuleClass.Builder builder, RuleDefinitionEnvironment environment) {
    return builder
        .requiresConfigurationFragments(CppConfiguration.class)
        /* <!-- #BLAZE_RULE(cc_proto_library).ATTRIBUTE(deps) -->
        The list of <code>proto_library</code>
        rules to generate C++ code for.
        <!-- #END_BLAZE_RULE.ATTRIBUTE --> */
        .override(
            attr("deps", LABEL_LIST)
                .allowedRuleClasses("proto_library")
                .allowedFileTypes()
                .aspect(ccProtoAspect))
        .build();
  }

  @Override
  public Metadata getMetadata() {
    return RuleDefinition.Metadata.builder()
        .name("cc_proto_library")
        .factoryClass(CcProtoLibrary.class)
        .ancestors(BaseRuleClasses.NativeActionCreatingRule.class)
        .build();
  }
}
The implementation is defined with .factoryClass(CcProtoLibrary.class).
For the second part of your question, "builtins" are a Bazel-internal concept to transparently swap out the Java implementation of a Bazel rule to its Starlark equivalent, without needing to add any load statements in the BUILD file. This is necessary to migrate existing users to Starlark implementations without causing user impact.
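For example, in a BUILD file the rule is used directly, without any load() statement, whether it is backed by the Java CcProtoLibraryRule above or by its Starlark builtins replacement (target names below are illustrative):
proto_library(
    name = "foo_proto",
    srcs = ["foo.proto"],
)

cc_proto_library(
    name = "foo_cc_proto",
    deps = [":foo_proto"],
)

cc_library(
    name = "uses_foo",
    srcs = ["uses_foo.cc"],
    deps = [":foo_cc_proto"],
)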

Import a Groovy class in a pipeline Jenkinsfile

I need to be able to create classes and use them within a Jenkins pipeline.
Let's say I have a very simple Groovy class, declared in a Groovy script, looking like this:
class MyClass {
    @Override
    public String toString() {
        return "toto";
    }
}
return new MyClass();
This class is located in the folder: Project\Buildfiles\Jenkins\com\external
Then in my Jenkinsfile I would do:
node('mynode') {
    toto = load 'Project\Buildfiles\Jenkins\com\external\MyClass.groovy'
    echo toto.toString()
}
And this actually works.
However, this poses a certain number of issues with my IDE, which does not understand what is happening. Also, this prevents me from having several constructors in my custom class.
What I have been trying to do, and for which I need help, is the following. In a file named ExternalClasses.groovy:
class Toto {
    @Override
    public String toString() {
        return "toto";
    }
}

class Tata {
    @Override
    public String toString() {
        return "tata";
    }
}
return this;
In the Jenkinsfile:
node('mynode') {
    external = load 'Project\Buildfiles\Jenkins\com\external\ExternalClasses.groovy'
    toto = new Toto();
    tata = new Tata();
}
And this fails.
I have tried several approaches, used package names, used the Toto.new() syntax, but none worked.
Any ideas?
Edit about Shared Libraries:
I actually have a Shared Library; it is used by several teams and contains very specific data which should be owned by the teams and not by the library.
We need to be able to move out of the library things which do not belong to it. The purpose of this work is to relieve the library of non-generic code.
You could use the Shared Library feature. Push your classes to a VCS like GitHub/Bitbucket, register the repository as a shared library in Jenkins, and they become available to all projects/jobs.
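As a minimal sketch (the library name my-shared-lib, the package com.external and the class body are illustrative), put the class under src/ in the library repository and import it from the Jenkinsfile:
// shared library repository: src/com/external/Toto.groovy
package com.external

class Toto implements Serializable {
    @Override
    String toString() {
        return "toto"
    }
}

// Jenkinsfile, assuming the library is registered in Jenkins as 'my-shared-lib'
@Library('my-shared-lib') _
import com.external.Toto

node('mynode') {
    def toto = new Toto()
    echo toto.toString()
}
Classes loaded this way can declare several constructors, and your IDE can index the library repository as an ordinary Groovy project.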

CDI in JUnit tests with Jersey Test Framework

We are using the Jersey Test Framework for API testing. In test mode, we use an H2 database, MySQL in production. Everything is fine up to this point.
Now I want to write tests for our repositories to check if the data is written properly to the database.
I can't inject any classes in my tests, so I am using the standard constructor to create a new instance of RepositoryA. That works for me.
Now the problem: RepositoryA itself injects an instance of RepositoryB, and injection isn't working in test scope.
Is it possible to get injection running in this environment?
Depending on the versions of the libraries you are using, running CDI in a JUnit test differs.
First you need to add this dependency, selecting the right version:
<dependency>
    <groupId>org.jboss.weld</groupId>
    <artifactId>weld-junit5</artifactId> <!-- or weld-junit4 -->
    <version>1.3.0.Final</version>
    <scope>test</scope>
</dependency>
Then you can enable Weld in your JUnit test. Here is an example of injecting a repository for an entity class called VideoGame:
import javax.inject.Inject;

import org.jboss.weld.junit5.EnableWeld;
import org.jboss.weld.junit5.WeldInitiator;
import org.jboss.weld.junit5.WeldSetup;
import org.junit.Assert;
import org.junit.jupiter.api.Test;

import lombok.extern.slf4j.Slf4j;

@Slf4j
@EnableWeld
class VideoGameRepositoryTest
{
    @WeldSetup
    private WeldInitiator weld = WeldInitiator.performDefaultDiscovery();

    @Inject
    private VideoGameRepository repo;

    @Test
    void test()
    {
        VideoGame videoGame = VideoGameFactory.newInstance();
        videoGame.setName("XENON");
        repo.save(videoGame);
        // testing if the ID field had been generated by the JPA provider.
        Assert.assertNotNull(videoGame.getVersion());
        Assert.assertTrue(videoGame.getVersion() > 0);
        log.info("Video Game : {}", videoGame);
    }
}
The important parts are:
the @EnableWeld annotation placed on the JUnit test class.
the @WeldSetup annotation placed on a WeldInitiator field, to look up all annotated classes.
don't forget a beans.xml in META-INF of your test classpath in order to set up the discovery mode (a minimal sample is shown after this list).
@Slf4j is a Lombok annotation; you don't need it (unless you are already using Lombok).
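For reference, a minimal beans.xml under src/test/resources/META-INF could look like this (the discovery mode here is an assumption; pick the one matching your project):
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://xmlns.jcp.org/xml/ns/javaee"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee
                           http://xmlns.jcp.org/xml/ns/javaee/beans_1_1.xsd"
       version="1.1"
       bean-discovery-mode="all">
</beans>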
Here the VideoGameRepository instance benefits from injection as well, like in a classical CDI project.
Here is the code of the VideoGameFactory, which gets a brand new instance of the entity class marked with @Dependent scope. This factory programmatically invokes the current CDI context.
import javax.enterprise.inject.spi.CDI;

public class VideoGameFactory
{
    public static VideoGame newInstance()
    {
        // ask CDI for the instance, injecting required dependencies.
        return CDI.current().select(VideoGame.class).get();
    }
}
Alternatively, you can have a look at Arquillian, which can come with a full Java EE server in order to have all the needed dependencies available.

How to populate parameter "defaultValue" in Maven "AbstractMojoTestCase"?

I have a Maven plugin that I am attempting to test using a subclass of the AbstractMojoTestCase. The plugin Mojo defines an outputFolder parameter with a defaultValue. This parameter is not generally expected to be provided by the user in the POM.
#Parameter(defaultValue = "${project.build.directory}/someOutputFolder")
private File outputFolder;
And if I use the plugin in a real scenario then the outputFolder gets defaulted as expected.
But if I test the Mojo using the AbstractMojoTestCase, then parameters defined in the test POM are populated, while parameters with a defaultValue that are not defined in the POM are not.
public class MyPluginTestCase extends AbstractMojoTestCase {

    public void testAssembly() throws Exception {
        final File pom = getTestFile("src/test/resources/test-pom.xml");
        assertNotNull(pom);
        assertTrue(pom.exists());

        final MyMojo myMojo = (MyMojo) lookupMojo("assemble", pom);
        assertNotNull(myMojo);
        myMojo.execute(); // Dies due to NullPointerException on outputFolder.
    }
}
Further: if I define the outputFolder parameter in the POM like so:
<outputFolder>${project.build.directory}/someOutputFolder</outputFolder>
then ${project.build.directory} is NOT resolved within the AbstractMojoTestCase.
So what do I need to do to get the defaultValue populated when testing?
Or is this a fault in the AbstractMojoTestCase?
This is Maven-3.2.3, maven-plugin-plugin-3.2, JDK 8
You need to use lookupConfiguredMojo.
Here's what I ended up using:
public class MyPluginTest
{
    @Rule
    public MojoRule mojoRule = new MojoRule();

    @Test
    public void noSource() throws Exception
    {
        // Just give the location where the pom.xml is located
        MyPlugin plugin = (MyPlugin) mojoRule.lookupConfiguredMojo(getResourcesFile("basic-test"), "myGoal");
        plugin.execute();
        assertThat(plugin.getSomeInformation()).isEmpty();
    }

    public File getResourcesFile(String filename)
    {
        return new File("src/test/resources", filename);
    }
}
Of course you need to replace myGoal with your plugin's goal. You also need to figure out how to assert that your plugin executed successfully.
For a more complete example, check out the tests I wrote for fmt-maven-plugin

How do I unit test a custom ant task?

I am writing a custom ant task that extends Task, and I am using the log() method in the task. What I want to do is use a unit test while developing the task, but I don't know how to set up a context for the task to run in, to initialise the task as if it were running in ant.
This is the custom Task:
public class CopyAndSetPropertiesForFiles extends Task {
    public void execute() throws BuildException {
        log("CopyAndSetPropertiesForFiles begin execute()");
        log("CopyAndSetPropertiesForFiles end execute()");
    }
}
This is the unit test code:
CopyAndSetPropertiesForFiles task = new CopyAndSetPropertiesForFiles();
task.execute();
When the code is run as a test it gives a NullPointerException when it calls log.
java.lang.NullPointerException
at org.apache.tools.ant.Task.log(Task.java:346)
at org.apache.tools.ant.Task.log(Task.java:334)
at uk.co.tbp.ant.custom.CopyAndSetPropertiesForFiles.execute(CopyAndSetPropertiesForFiles.java:40)
at uk.co.tbp.ant.custom.test.TestCopyAndSetPropertiesForFiles.testCopyAndSetPropertiesForFiles(TestCopyAndSetPropertiesForFiles.java:22)
Does anybody know a way to provide a context or stubs or something similar to the task?
Thanks,
Rob.
Accepted answer from Abarax. I was able to call task.setProject(new Project());
The code now executes OK (except no logging appears in the console - at least I can exercise the code :-) ).
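If you also want the log output to show up on the console, one option (a sketch, not from the accepted answer) is to attach Ant's DefaultLogger to the Project before running the task:
import org.apache.tools.ant.DefaultLogger;
import org.apache.tools.ant.Project;

Project project = new Project();
project.init();

DefaultLogger consoleLogger = new DefaultLogger();
consoleLogger.setOutputPrintStream(System.out);
consoleLogger.setErrorPrintStream(System.err);
consoleLogger.setMessageOutputLevel(Project.MSG_INFO);
project.addBuildListener(consoleLogger);

CopyAndSetPropertiesForFiles task = new CopyAndSetPropertiesForFiles();
task.setProject(project);
task.execute(); // log() output now reaches the console via the project's build listeners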
Or better yet, decouple the task object itself from the logic inside the task (let's call it TaskImpl), so that you can pass in your own dependencies (e.g., the logger). Then, instead of testing the task object, you test TaskImpl, to which you can pass the logger and any other odd bits and pieces it might need to do its job. Unit testing is then a matter of mocking the dependencies.
Looking at the Ant source code, these are the two relevant classes: ProjectComponent and Task.
You are calling the log method from Task:
public void log(String msg) {
    log(msg, Project.MSG_INFO);
}
Which calls:
public void log(String msg, int msgLevel) {
    if (getProject() != null) {
        getProject().log(this, msg, msgLevel);
    } else {
        super.log(msg, msgLevel);
    }
}
Since you do not have a project set, it will call super.log(msg, msgLevel), which is ProjectComponent's log:
public void log(String msg, int msgLevel) {
    if (getProject() != null) {
        getProject().log(msg, msgLevel);
    } else {
        // 'reasonable' default, if the component is used without
        // a Project ( for example as a standalone Bean ).
        // Most ant components can be used this way.
        if (msgLevel <= Project.MSG_INFO) {
            System.err.println(msg);
        }
    }
}
It looks like this may be your problem. Your task needs a project context.
Ant has a handy class called BuildFileTest that extends the JUnit TestCase class. You can use it to test the behaviour of individual targets in a build file. Using this would take care of all the annoying context.
There's a Test The Task chapter in the Apache Ant Writing Tasks Tutorial that describes this.
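As a rough sketch (the build file path and target name are made up for the example), a BuildFileTest-based test looks like this:
import org.apache.tools.ant.BuildFileTest;

public class CopyAndSetPropertiesForFilesTest extends BuildFileTest {

    @Override
    protected void setUp() throws Exception {
        // points the test at an ant build file that declares and calls the custom task
        configureProject("src/test/resources/copy-and-set-properties-build.xml");
    }

    public void testExecuteLogsStartAndEnd() {
        executeTarget("runCopyAndSetProperties");
        assertLogContaining("CopyAndSetPropertiesForFiles begin execute()");
    }
}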
