Select a different runner for cucumber.api.cli.Main? - Jenkins

Is it possible to define/specify a runner when starting tests from Cucumber's command line (cucumber.api.cli.Main)?
My reason for this is so I can generate XML reports in Jenkins and push the results to ALM Octane.
I kind of inherited this project, and it uses Gradle to do a javaexec and call cucumber.api.cli.Main.
I know it's possible to do this with @RunWith(OctaneCucumber.class) when using the JUnit runner + Maven (or only the JUnit runner); otherwise that annotation is ignored. I have the custom runner with that annotation, but when I run from cucumber.api.cli.Main I can't find a way to run with it, and the annotation just gets ignored.
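For reference, this is roughly what the working JUnit setup looks like (a sketch; the @CucumberOptions values are placeholders and the OctaneCucumber import path may differ between plugin versions):
import cucumber.api.CucumberOptions;
import org.junit.runner.RunWith;
// Package name depends on the octane-cucumber-jvm version in use
import com.hpe.alm.octane.OctaneCucumber;

@RunWith(OctaneCucumber.class)
@CucumberOptions(features = "src/test/resources/features", glue = "my.steps")
public class RunOctaneCucumberTest {
}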

What @Grasshopper suggested didn't exactly work, but it made me look in the right direction.
Instead of adding the code as a plugin, I managed to "hack/load" the Octane reporter by creating a copy of cucumber.api.cli.Main, using it as a base to run the CLI commands, changing the run method a bit, and adding the plugin at runtime. I needed to do this because the plugin requires quite a few parameters in its constructor. It might not be the perfect solution, but it allowed me to keep the Gradle build process I initially had.
public static byte run(String[] argv, ClassLoader classLoader) throws IOException {
    RuntimeOptions runtimeOptions = new RuntimeOptions(new ArrayList<String>(asList(argv)));
    ResourceLoader resourceLoader = new MultiLoader(classLoader);
    ClassFinder classFinder = new ResourceLoaderClassFinder(resourceLoader, classLoader);
    Runtime runtime = new Runtime(resourceLoader, classFinder, classLoader, runtimeOptions);
    // ==================== Added the following lines ====================
    // Hardcoded runner class: if it changes, it will need to be changed here as well
    OutputFile outputFile = new OutputFile(Main.class);
    runtimeOptions.addPlugin(new HPEAlmOctaneGherkinFormatter(resourceLoader, runtimeOptions.getFeaturePaths(), outputFile));
    // ===================================================================
    runtime.run();
    return runtime.exitStatus();
}
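With that in place, the copied class only needs the usual main entry point so the existing Gradle javaexec call can be pointed at it instead of cucumber.api.cli.Main. A minimal sketch, mirroring what the original Main does:
public static void main(String[] argv) throws Throwable {
    // Delegate to the patched run() above and propagate its exit status
    byte exitStatus = run(argv, Thread.currentThread().getContextClassLoader());
    System.exit(exitStatus);
}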

Related

Apache Beam exception when running wordcount example

I think I followed every step in the document, but I still ran into this exception. (The only difference is that I run this from Eclipse J2EE, but I wouldn't expect that to really matter, does it?)
Code: (I didn't write this; it's straight from the Beam project example.) I think you'd have to specify a Google Cloud Platform project and provide the right credentials to access it. However, I didn't find anywhere in this example project where that is set up.
public static void main(String[] args) {
    // Create a PipelineOptions object. This object lets us set various execution
    // options for our pipeline, such as the runner you wish to use. This example
    // will run with the DirectRunner by default, based on the class path configured
    // in its dependencies.
    PipelineOptions options = PipelineOptionsFactory.create();
    // Create the Pipeline object with the options we defined above.
    Pipeline p = Pipeline.create(options);
    // Apply the pipeline's transforms.
    // Concept #1: Apply a root transform to the pipeline; in this case, TextIO.Read to read a set
    // of input text files. TextIO.Read returns a PCollection where each element is one line from
    // the input text (a set of Shakespeare's texts).
    // This example reads a public data set consisting of the complete works of Shakespeare.
    p.apply(TextIO.Read.from("gs://apache-beam-samples/shakespeare/*"))
    .....
)
Exception:
Exception in thread "main" java.lang.IllegalStateException: Failed to validate gs://apache-beam-samples/shakespeare/*
at org.apache.beam.sdk.io.TextIO$Read$Bound.expand(TextIO.java:309)
at org.apache.beam.sdk.io.TextIO$Read$Bound.expand(TextIO.java:205)
at org.apache.beam.sdk.runners.PipelineRunner.apply(PipelineRunner.java:76)
at org.apache.beam.runners.direct.DirectRunner.apply(DirectRunner.java:296)
at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:388)
at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:302)
at org.apache.beam.sdk.values.PBegin.apply(PBegin.java:47)
at org.apache.beam.sdk.Pipeline.apply(Pipeline.java:152)
at google.dataflow.beam.example.MinimalWordCount.main(MinimalWordCount.java:77)
Caused by: java.io.IOException: Unable to match files in bucket apache-beam-samples, prefix shakespeare/ against pattern shakespeare/[^/]*
at org.apache.beam.sdk.util.GcsUtil.expand(GcsUtil.java:234)
at org.apache.beam.sdk.util.GcsIOChannelFactory.match(GcsIOChannelFactory.java:53)
at org.apache.beam.sdk.io.TextIO$Read$Bound.expand(TextIO.java:304)
... 8 more
Caused by: com.google.api.client.http.HttpResponseException: 400 Bad Request
{
"error" : "invalid_grant"
}
at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1070)
at com.google.auth.oauth2.UserCredentials.refreshAccessToken(UserCredentials.java:207)
at com.google.auth.oauth2.OAuth2Credentials.refresh(OAuth2Credentials.java:149)
at com.google.auth.oauth2.OAuth2Credentials.getRequestMetadata(OAuth2Credentials.java:135)
at com.google.auth.http.HttpCredentialsAdapter.initialize(HttpCredentialsAdapter.java:96)
at com.google.cloud.hadoop.util.ChainingHttpRequestInitializer.initialize(ChainingHttpRequestInitializer.java:52)
at com.google.api.client.http.HttpRequestFactory.buildRequest(HttpRequestFactory.java:93)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.buildHttpRequest(AbstractGoogleClientRequest.java:300)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:419)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
at com.google.cloud.hadoop.util.ResilientOperation$AbstractGoogleClientRequestExecutor.call(ResilientOperation.java:166)
at com.google.cloud.hadoop.util.ResilientOperation.retry(ResilientOperation.java:66)
at com.google.cloud.hadoop.util.ResilientOperation.retry(ResilientOperation.java:103)
at org.apache.beam.sdk.util.GcsUtil.expand(GcsUtil.java:227)
... 10 more
Try running it from the command prompt if you are using Windows.
Go to the folder containing the pom.xml file and open a command prompt there,
then run the command with the respective arguments:
mvn compile exec:java -Dexec.mainClass=org.apache.beam.examples.WordCount -Dexec.args="--output=counts" -Pdirect-runner
If you want to run it with your own input file, create a .txt file with any name, put it in the folder containing the pom.xml, and then run:
mvn compile exec:java -Dexec.mainClass=org.apache.beam.examples.WordCount -Dexec.args="--inputFile=YOURFILENAME.txt --output=counts" -Pdirect-runner
Hope this helps; I am still looking into the rest of your issue.
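For what it's worth, the reason --inputFile and --output take effect for WordCount (but not for the MinimalWordCount snippet in the question) is that WordCount declares its own options interface and parses it from the command-line arguments. A minimal sketch of that pattern, assuming the same pre-2.0 TextIO API used above (class and option names here are illustrative, not the exact example code):
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.Default;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class LocalCopyPipeline {
    // Populated from --inputFile=... and --output=... on the command line
    public interface Options extends PipelineOptions {
        @Default.String("input.txt")
        String getInputFile();
        void setInputFile(String value);

        @Default.String("counts")
        String getOutput();
        void setOutput(String value);
    }

    public static void main(String[] args) {
        Options options = PipelineOptionsFactory.fromArgs(args).withValidation().as(Options.class);
        Pipeline p = Pipeline.create(options);
        // Reading a local file sidesteps the GCS "invalid_grant" credential problem entirely
        p.apply(TextIO.Read.from(options.getInputFile()))
         .apply(TextIO.Write.to(options.getOutput()));
        p.run();
    }
}
Reading and writing local files like this also makes it easy to verify the pipeline itself before worrying about Google Cloud credentials.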

Trigger Maven Release Remotely

I want to start a Maven release programmatically from a Java program. This webpage shows how that is done generally. So that's what I did:
final URL url = new URL("http://jenkins/job/MyProject/m2release/submit");
final HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
urlConnection.setRequestMethod("POST");
urlConnection.setDoOutput(true);
final String userpass = "User:Password";
final String authentication = "Basic " + DatatypeConverter.printBase64Binary(userpass.getBytes());
urlConnection.setRequestProperty("Authorization", authentication);
try (final OutputStream os = urlConnection.getOutputStream();
     final BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(os, "UTF-8"))) {
    writer.write("releaseVersion=1.2.3&");
    writer.write("developmentVersion=1.2.4-SNAPSHOT&");
    // writer.write("isDryRun=on&"); // uncomment for dry run
    writer.write("scmUsername=User&");
    writer.write("scmPassword=Password&");
    writer.write("scmCommentPrefix=[test]&");
    writer.write("json={\"releaseVersion\":\"1.2.3\",\"developmentVersion\":\"1.2.4-SNAPSHOT\",\"isDryRun\":false}&");
    writer.write("Submit=Schedule Maven Release Build");
    writer.flush();
}
urlConnection.connect();
try (BufferedReader reader = new BufferedReader(
        new InputStreamReader(urlConnection.getInputStream(), "UTF-8"))) {
    for (String line; (line = reader.readLine()) != null;) {
        System.out.println(line);
    }
}
This forum suggests to "just have a look at the form before you do a release and you should be able to craft a cURL request", which is what I did to get this far. The release starts, at least.
I just don't know how to escape everything. The browser shows spaces as "+", but it does not work if I send the data that way. In fact, neither " ", "+" nor "%20" works as a space.
Still, I get "Unable to commit files" in the build, so I'm pretty sure something is wrong with the username / password / comment prefix. Jenkins itself returns the log-in page ("Authentication required"), even though the log-in is sent.
What is the correct way to trigger a Maven release on Jenkins?
Okay, the parameters missing in my old approach were:
writer.write("specifyScmCredentials=on&");
writer.write("specifyScmCommentPrefix=on&");
The statement that comes to mind here is "Not all Jenkins jobs are equal".
The code you have posted simply invokes a Jenkins job called MyProject with the parameters releaseVersion and developmentVersion.
However, the code has nothing to do with what MyProject does. It could be a job designed to build a Maven project, or a Gradle project, or a .NET project.
What you want to do (invoke the Maven release plugin) is the responsibility of the Jenkins job itself.
Have a look at the configuration of MyProject, specifically whether it invokes a Maven build step that runs the release plugin.
Useful links
Maven Release Plugin
Building a Maven Project in Jenkins
Providing Parameters to Jenkins Builds
By the way, one more valuable thing to mention is that you can pass custom parameters as part of this
http://jenkins/job/MyProject/m2release/submit?json={} request. To do that, you have to define the query parameter json, for example:
json={"parameter": {"name":"CUSTOM_PARAM_1", "value":"CUSTOM_PARAM_1_VALUE"}}
At the moment of execution it will be treated as a parameterized job and you will have your parameter together with:
MVN_RELEASE_VERSION=X
MVN_DEV_VERSION=X-SNAPSHOT
MVN_ISDRYRUN=false
CUSTOM_PARAM_1=CUSTOM_PARAM_1_VALUE

Jenkins, how to check regressions against another job

When you set up a Jenkins job various test result plugins will show regressions if the latest build is worse than the previous one.
We have many jobs for many projects on our Jenkins and we wanted to avoid having a 'job per branch' setup. So currently we are using a parameterized build to build, e.g., different development branches using a single job.
But that means when I build a new branch any regressions are measured against the previous build, which may be for a different branch. What I really want is to measure regressions in a feature branch against the latest build of the master branch.
I thought we should probably set up a separate 'master' build alongside the parameterized 'branches' build. But I still can't see how I would compare results between jobs. Is there any plugin that can help?
UPDATE
I have started experimenting in the Script Console to see if I could write a post-build script. I have managed to get the latest build of the master branch in my parameterized job, but I can't work out how to get to the test results from the build object.
The data I need is available in JSON at
http://<jenkins server>/job/<job name>/<build number>/testReport/api/json?pretty=true
...if I could just get at this data structure it would be great!
I tried using JsonSlurper to load the JSON via HTTP, but I get a 403, I guess because my script has no authenticated session.
I guess I could load the XML test results from disk and parse them in my script; it just seems a bit stupid when Jenkins has already done this.
I eventually managed to achieve everything I wanted using a Groovy script in the Groovy Postbuild Plugin.
I did a lot of exploring using the script console at http://<jenkins>/script, and the Jenkins API class docs are also handy.
Everyone's use is going to be a bit different, as you have to dig down into the build plugins to get the info you need, but here are some bits of my code which may help.
First get the build you want:
def getProject(projectName) {
    // in a postbuild action use `manager.hudson`
    // in the script web console use `Jenkins.instance`
    def project = manager.hudson.getItemByFullName(projectName)
    if (!project) {
        throw new RuntimeException("Project not found: $projectName")
    }
    project
}
// CloudBees folder plugin is supported, you can use natural paths:
project = getProject('MyFolder/TestJob')
build = project.getLastCompletedBuild()
The main test results (JUnit etc.) seem to be available directly on the build as:
result = build.getTestResultAction()
// eg
failedTestNames = result.getFailedTests().collect { test ->
    test.getFullName()
}
To get the more specialised results from, e.g., the Violations plugin or Cobertura code coverage, you have to look for a specific build action.
// have a look what's available:
build.getActions()
You'll see a list of stuff like:
[hudson.plugins.git.GitTagAction@2b4b8a1c,
hudson.scm.SCMRevisionState$None@40d6dce2,
hudson.tasks.junit.TestResultAction@39c99826,
jenkins.plugins.show_build_parameters.ShowParametersBuildAction@4291d1a5]
These are instances; the part in front of the @ sign is the class name, so I used that to make this method for getting a specific action:
def final VIOLATIONS_ACTION = hudson.plugins.violations.ViolationsBuildAction
def final COVERAGE_ACTION = hudson.plugins.cobertura.CoberturaBuildAction
def getAction(build, actionCls) {
    def action = build.getActions().findResult { act ->
        actionCls.isInstance(act) ? act : null
    }
    if (!action) {
        throw new RuntimeException("Action not found in ${build.getFullDisplayName()}: ${actionCls.getSimpleName()}")
    }
    action
}
violations = getAction(build, VIOLATIONS_ACTION)
// you have to explore a bit more to find what you're interested in:
pylint_count = violations?.getReport()?.getViolations()?."pylint"
coverage = getAction(build, COVERAGE_ACTION)?.getResults()
// if you println it looks like a map but it's really an Enum of Ratio objects
// convert to something nicer to work with:
coverage_map = coverage.collectEntries { key, val -> [key.name(), val.getPercentageFloat()] }
With these building blocks I was able to put together a post-build script which compared the results for two 'unrelated' build jobs, and then used the Groovy Postbuild plugin's helper methods to set the build status.
Hope this helps someone else.

How to queue another TFS (2012) Build from a TFS Build AND pass process parameters?

The product I work on comprises three or four separate (non-dependent) TFS builds.
I would like to create a single TFS build which queues the other builds from within the ProcessTemplate AND, critically, passes process parameters to them. This build would wait for them all to complete and return an overall success/failure.
So my questions are:
Can this be achieved by any existing 'standard' Workflow activities (my manager has had bad experiences with custom workflow activities)?
If not, I am able to 'shell out' to PowerShell. Can I achieve what I want from within PowerShell (accessing the API)?
Maybe using TFSBuild.exe? But I can't find a way of passing the custom process parameters I need.
Any assistance or guidance would be appreciated.
UPDATE
The following PowerShell script will queue the build, but I'm still at a loss as to how to pass my custom process parameters :-(
function Get-BuildServer
{
    param($serverName = $(throw 'please specify a TFS server name'))
    [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client")
    [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Build.Client")
    $tfs = [Microsoft.TeamFoundation.Client.TeamFoundationServerFactory]::GetServer($serverName)
    return $tfs.GetService([Microsoft.TeamFoundation.Build.Client.IBuildServer])
}
$buildserver = Get-BuildServer "http://tfsserver:8080/tfs/My%20Project%20Collection"
$teamProject = "ESI"
$buildDefinition = "iPrl_BuildMaster"
$definition = $buildserver.GetBuildDefinition($teamProject, $buildDefinition)
$request = $definition.CreateBuildRequest()
$buildserver.QueueBuild($request, "None")
Now, after googling, I have found the following C# code to update the verbosity; assuming it's the same for my custom process parameters, I need to convert this to work with the above PowerShell script. Any ideas?
IDictionary<String, Object> paramValues = WorkflowHelpers.DeserializeProcessParameters(processParameters);
paramValues[ProcessParameterMetadata.StandardParameterNames.Verbosity] = buildVerbosity;
return WorkflowHelpers.SerializeProcessParameters(paramValues);

How to get build definition from TFS and pass it to the external program

This is what we are doing manually:
1) Queue new build
2) Once build is completed go to the drop folder and get the exe name
3) Pass this exe name to the test automation program and run it.
I want to automate these 3 steps.
Is it possible to get the build definition programmatically?
Create a custom build template. Use a copy of the default (or whatever you're using now) as your starting point. Look in the workflow for where BuildDetail.CompilationStatus = BuildPhaseStatus.Succeeded. You will then have the opportunity to invoke another application; it could be a stub program, a PowerShell script, or any other executable process. You can pass the path of the build that you just completed by using BuildDetail.DropLocation.
Assuming that your step #1 has executed, this latest (successful!) build is reachable as the lastKnownGoodBuild of the specific build definition. With this in mind you can employ a console app based on the following:
using System;
using System.IO;
using Microsoft.TeamFoundation.Build.Client;
using Microsoft.TeamFoundation.Client;

namespace BuildDropLocation
{
    class Program
    {
        static void Main()
        {
            TfsTeamProjectCollection teamProjectCollection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri("http://yourTFSServerUri"));
            var buildService = (IBuildServer)teamProjectCollection.GetService(typeof(IBuildServer));
            IBuildDefinition myBuildDefinition = buildService.GetBuildDefinition("TeamProjectName", "BuildDefinitionName");
            Uri lastKnownGoodBuild = myBuildDefinition.LastGoodBuildUri;
            IBuildDetail myBuildDetail = buildService.GetBuild(lastKnownGoodBuild);
            string[] myExeFiles = Directory.GetFiles(myBuildDetail.DropLocation, "*.exe", SearchOption.AllDirectories);
            foreach (var exeFile in myExeFiles)
            {
                Console.WriteLine(exeFile);
            }
        }
    }
}
With the above you can retrieve the path to any *.exe under the drop location of the last build of build definition BuildDefinitionName that lives in Team Project TeamProjectName. This approach allows you to fully separate your TFS build from the execution of your tests. You can, for example, schedule this console app to execute every night and invoke your runner to operate on the latest successful build. In case you would like the build and the test run to be coupled in any way, you should proceed as @TimWagaman suggests by invoking your test runner during the build. This 'coupling' might include:
The test results are contained in the build log
A failure generates a Bug
Test coverage is reportable
In this case, your tests will execute with each and every build that doesn't break in the compilation phase.
<MakeDir Directories="$(TemporaryFolder)" />
<Exec Condition=" '$(IsInTeamBuild)'=='True'" Command="&quot;$(TfsTask)&quot; history ../ /r /noprompt /stopafter:1 /version:W &gt; &quot;$(TemporaryFolder)\grab-changeset.txt&quot;" />
<Exec Condition=" '$(IsInTeamBuild)'=='True'" Command="&quot;$(TfsTask)&quot; properties &quot;$(MyMSBuildStartupDirectory)\all-companies-run-after-update.js&quot; &gt; &quot;$(TemporaryFolder)\grab-properties.txt&quot;" />
We use the above to extract the build number, branch, and revision number from the generated .txt files.
