Integration of Jenkins with Xray with the help of Jenkinsfile

I am getting this error on the integration of Jenkins with Xray using a Jenkinsfile:
Unable to confirm Result of the upload..... Upload Failed! Status:400 Response:{"error":"Error assembling issue data: project is required"}
Does anyone have an idea about this issue and how to resolve it?

The issue is that you are using endpointName: '/cucumber/multipart'. When using this endpoint, Xray expects either:
one file with the test results in JSON format and a second JSON configuration file where you configure other values (use inputInfoSwitcher: 'filePath' and importInfo: <file_path>), or
one file with the test results in JSON format and a JSON text with the extra configurations you want to pass to Xray (use inputInfoSwitcher: 'fileContent' and importInfo: <json_text>).
Notice that if the values in the configuration file or in the JSON text are not correct, they will not be used.
In the above case, it seems that you are not correctly replacing the values in the JSON text.
You can find more info here: https://docs.getxray.app/display/XRAY/Integration+with+Jenkins#IntegrationwithJenkins-Pipelineprojectssupport
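For reference, a minimal sketch of such a step using the fileContent switcher; the project key PROJ, the results path, and the instance/credential IDs below are placeholders, not values taken from your setup:
step([$class: 'XrayImportBuilder',
    endpointName: '/cucumber/multipart',
    // placeholder path to the Cucumber JSON results file
    importFilePath: 'results/cucumber.json',
    // pass the extra configuration inline as JSON text
    inputInfoSwitcher: 'fileContent',
    // the "project" field is exactly what the "project is required" error is asking for
    importInfo: '{"fields": {"project": {"key": "PROJ"}, "summary": "Cucumber results", "issuetype": {"name": "Test Execution"}}}',
    serverInstance: '<your Xray server instance id>',
    credentialId: '<your Jenkins credential id>'])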

Related

Jenkins – How to add label to Jira issue from Jenkins pipeline

I developed automated tests in Java. The XML test report is generated with JUnit 5 and xray-junit-extension. This XML is currently being imported into Jira/Xray, but unfortunately the labels are not being added to the issue.
I believe that the labels could be integrated in two different ways: 1) through this XML test report, or, alternatively, 2) through the Jenkins pipeline itself.
My XML contains the following property:
(screenshot of the XML property)
This is similar to what is written in the Xray documentation:
(screenshot of the documentation)
https://docs.getxray.app/display/XRAY/Taking+advantage+of+JU...
The only difference is that in the Xray documentation there is a wrap around the tags property. In my XML I do not have that wrap.
Do you happen to have any idea on why the label is not being added in Jira/Xray?
The second approach would be to use the XrayImportBuilder to add a label via importInfo:
step([$class: 'XrayImportBuilder',
    endpointName: '/junit',
    importFilePath: '/reports/*.xml',
    projectKey: 'P34AMA',
    importToSameExecution: 'false',
    //testExecKey: 'TSTLKS-753',
    serverInstance: '3146a388-d399-4e55-ae28-8c65404d6f9d',
    credentialId: '55287529-194d-4e91-9964-7d740d8d2f61',
    importInfo: '{"fields": {"labels": ["label"]}}'
    //importInfo: '{"fields": {"labels": ["EOD"]}}'
])
But when adding importInfo to my pipeline, the build ends with an error:
(screenshots of the Jenkins logs and of the Jenkins import step)
Is anyone aware of any other way to add a label to Jira automatically without using hudson.plugins.jira.JiraIssueUpdater?
Thank you very much for your help!

Jenkins Xray Integration - Jira Issue Type with wrong character

In my Jenkins pipeline I have a Jira/Xray integration step:
step([$class: 'XrayImportBuilder',
    endpointName: '/xunit',
    fixVersion: '1.0',
    importFilePath: '/MyFirstUnitTests/TestResults.xml',
    importToSameExecution: 'true',
    testExecKey: 'TSTLKS-753',
    serverInstance: '9146a388-e399-4e55-be28-8c65404d6f9d',
    credentialId: '75287529-134d-4s91-9964-7h740d8d2i63'])
Currently I'm getting the following error:
ERROR: Unable to confirm Result of the upload..... Upload Failed!
Status:400 Response:{"error":"Issue with key
\u0027TSTLKS-753\u0027 does not exist or is not of type Test
Execution."}
But my issue (TSTLKS-753) is of type "Test Execution".
It appears that the string "\u0027" is being added both as a prefix and as a suffix to my issue key when building the pipeline.
I've searched for this string and it appears to be a quotation mark (single quote).
I tried replacing it with double quotes, but I end up with the same error. I also tried removing them.
In any case, if someone has already run into this error, please let me know. Thank you very much.
Note that \u0027 is just the JSON-escaped single quote wrapping the issue key in the error message, not a character added to the key itself. Can you confirm that the user that you have configured in Jenkins for the Xray instance has access to the Jira project where your Test Execution issue lives?
Can you try to import it without specifying the testExecKey field, with importToSameExecution: 'false', and specifying the projectKey field, using something like projectKey: 'TSTLKS'?
If this last option returns an error (e.g. "project does not exist") then it's for sure a permission issue, so you'll either need to use a different Jira user/pass or fix the permissions on the Jira side.
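For illustration, here is the step from the question rewritten with that suggestion; only importToSameExecution, projectKey, and the removed testExecKey change, everything else is as in the question:
step([$class: 'XrayImportBuilder',
    endpointName: '/xunit',
    fixVersion: '1.0',
    importFilePath: '/MyFirstUnitTests/TestResults.xml',
    // import into a fresh Test Execution under the project instead of a fixed issue
    importToSameExecution: 'false',
    projectKey: 'TSTLKS',
    serverInstance: '9146a388-e399-4e55-be28-8c65404d6f9d',
    credentialId: '75287529-134d-4s91-9964-7h740d8d2i63'])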

Exported Dataflow Template Parameters Unknown

I've exported a Cloud Dataflow template from Dataprep as outlined here:
https://cloud.google.com/dataprep/docs/html/Export-Basics_57344556
In Dataprep, the flow pulls in text files via wildcard from Google Cloud Storage, transforms the data, and appends it to an existing BigQuery table. All works as intended.
However, when trying to start a Dataflow job from the exported template, I can't seem to get the startup parameters right. The error messages aren't overly specific but it's clear that for one thing, I'm not getting the locations (input and output) right.
The only Google-provided template for this use case (found at https://cloud.google.com/dataflow/docs/guides/templates/provided-templates#cloud-storage-text-to-bigquery) doesn't apply as it uses a UDF and also runs in Batch mode, overwriting any existing BigQuery table rather than append.
Inspecting the original Dataflow job details from Dataprep shows a number of parameters (found in the metadata file) but I haven't been able to get those to work within my code. Here's an example of one such failed configuration:
import time
from google.cloud import storage
from googleapiclient.discovery import build
from oauth2client.client import GoogleCredentials

PROJECT = "[project]"  # placeholder for the GCP project ID

def dummy(event, context):
    pass

def process_data(event, context):
    # authenticate and build a Dataflow API client
    credentials = GoogleCredentials.get_application_default()
    service = build('dataflow', 'v1b3', credentials=credentials)
    data = event
    gsclient = storage.Client()
    file_name = data['name']
    time_stamp = time.time()
    GCSPATH = "gs://[path to template]"
    BODY = {
        "jobName": "GCS2BigQuery_{tstamp}".format(tstamp=time_stamp),
        "parameters": {
            "inputLocations": '{{\"location1\":\"[my bucket]/{filename}\"}}'.format(filename=file_name),
            "outputLocations": '{{\"location1\":\"[project]:[dataset].[table]\", [... other locations]"}}',
            "customGcsTempLocation": "gs://[my bucket]/dataflow"
        },
        "environment": {
            "zone": "us-east1-b"
        }
    }
    print(BODY["parameters"])
    # launch a job from the exported template
    request = service.projects().templates().launch(projectId=PROJECT, gcsPath=GCSPATH, body=BODY)
    response = request.execute()
    print(response)
The above example returns an error indicating an invalid field ("location1"), which I pulled from a completed Dataflow job. I know I need to specify the GCS location, the template location, and the BigQuery table, but I haven't found the correct syntax anywhere. As mentioned above, I found the field names and sample values in the job's generated metadata file.
I realize that this specific use case may not ring any bells, but in general, if anyone has had success determining and using the correct startup parameters for a Dataflow job exported from Dataprep, I'd be most grateful to learn more about that. Thanks.
I think you need to review this document; it explains exactly the syntax required for passing the various pipeline options, including the location parameters you need [1].
Specifically, the following line in your code snippet does not follow the correct syntax:
"inputLocations" : '{{\"location1\":\"[my bucket]/{filename}\"}}'.format(filename=file_name)
In addition to document [1], you should also review the available pipeline options and their correct syntax [2].
Please use the links; they are the official documentation links from Google. These links will never go stale or be removed, as they are actively monitored and maintained by a dedicated team.

How can we get the summary report parameters (throughput, received & sent bytes) for a JMeter script in non-GUI mode?

How can we get the summary report parameters (throughput, received & sent bytes) for a JMeter script in non-GUI mode? I have to implement the benchmarking on the whole script rather than on each thread, to mark the script as pass/fail by comparing the results to a static .csv file which contains the parameter values. Kindly let me know which approach to opt for.
The easiest way is to go with the JMeterPluginsCMD Command Line Tool; it can generate various tables and charts out of JMeter's .jtl results file.
So you will need to add the following command as a post-build step:
JMeterPluginsCMD --generate-csv SummaryReport.csv --input-jtl result.jtl --plugin-type AggregateReport
You can install the JMeterPluginsCMD Command Line Tool using the JMeter Plugins Manager.
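If you are driving this from a Jenkins pipeline, a rough sketch could look like the following; it assumes jmeter and JMeterPluginsCMD are on the agent's PATH, and testplan.jmx and baseline.csv are placeholder names for your test plan and your static benchmark file:
sh 'jmeter -n -t testplan.jmx -l result.jtl'
sh 'JMeterPluginsCMD --generate-csv SummaryReport.csv --input-jtl result.jtl --plugin-type AggregateReport'
script {
    // naive whole-script comparison: fail the build if the aggregate totals differ from the baseline
    def summary = readFile('SummaryReport.csv').readLines().last()
    def baseline = readFile('baseline.csv').readLines().last()
    if (summary != baseline) {
        error "Benchmark check failed: ${summary} does not match ${baseline}"
    }
}
How the comparison is done (exact match, per-column thresholds, tolerances) is up to you; this only shows where the generated CSV can be read back into the pipeline.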

Artifactory Pipeline Step & Shared Libraries

I am trying to push a bunch of repeated Jenkinsfile pipeline steps into a shared library.
However, I ran into an issue when moving an Artifactory build step; I get this error:
com.fasterxml.jackson.databind.JsonMappingException: Can not deserialize instance of java.lang.String out of START_OBJECT token
at [Source: N/A; line: -1, column: -1] (through reference chain: org.jfrog.hudson.pipeline.types.deployers.MavenDeployer["releaseRepo"])
at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:270)
I've created an example Jenkins project and a shared library showing the error.
I get the impression this means you can't run the Artifactory setup/build within a shared library. However, I found a post that shows some things are possible.
I can't find any examples where the deployer/run are actually in a shared library, however.
Any thoughts or suggestions would be appreciated.
Thanks
-B
The issue I encountered was one of GString interpolation. The invocation of the deployer(...) method required that the parameters be immutable at the time of execution.
To do this, my interpolated strings needed to be converted to immutable strings, meaning that this:
rtMaven.deployer(releaseRepo: "${config.releaseRepo}", snapshotRepo: "${config.snapshotRepo}", server: artServer)
became this:
rtMaven.deployer(releaseRepo: config.releaseRepo.toString(), snapshotRepo: config.snapshotRepo.toString(), server: artServer)
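For context, a minimal sketch of how that call might sit inside a shared-library step; the file name vars/buildWithArtifactory.groovy, the config fields, and the Maven goals are hypothetical, not taken from the original project:
// vars/buildWithArtifactory.groovy (hypothetical step name)
def call(Map config) {
    // assumes an Artifactory server with this ID is configured in Jenkins
    def artServer = Artifactory.server(config.serverId.toString())
    def rtMaven = Artifactory.newMavenBuild()
    // name of a Maven installation configured under Jenkins global tools
    rtMaven.tool = config.mavenTool.toString()
    // toString() converts the GStrings into immutable java.lang.String values,
    // avoiding the JsonMappingException during deserialization
    rtMaven.deployer(releaseRepo: config.releaseRepo.toString(),
                     snapshotRepo: config.snapshotRepo.toString(),
                     server: artServer)
    def buildInfo = rtMaven.run(pom: 'pom.xml', goals: 'clean install')
    artServer.publishBuildInfo(buildInfo)
}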
-B
