I am trying to invoke an AWS Lambda function from a Jenkinsfile with a payload, but I am not able to inject the instance_ip variable into the payload.
def instance_ip = "10.X.X.X"
pipeline {
    agent any
    stages {
        stage('Terminate Machine') {
            steps {
                script {
                    sh(script: 'aws lambda invoke --function-name terminate-instance --payload '{"private_ip_address":"${instance_ip}" }')
                }
            }
        }
    }
}
You need to escape some characters and set --invocation-type to Event:
sh(script: "aws lambda invoke \
    --function-name 'terminate-instance' \
    --invocation-type Event \
    --payload '{ \"private_ip_address\":\"${instance_ip}\" }' \
    /tmp/response.json")
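One extra caveat: with AWS CLI v2, a JSON payload also needs --cli-binary-format raw-in-base64-out, and building the payload in a shell variable first keeps the quoting manageable. A minimal sketch (the IP value and function name are placeholders, and the actual invoke is only shown commented out):

```shell
# Build the JSON payload in a variable first; printf's %s keeps the quoting readable.
instance_ip="10.0.0.1"   # placeholder value; in the pipeline this comes from Groovy
payload=$(printf '{"private_ip_address":"%s"}' "$instance_ip")
echo "$payload"

# With AWS CLI v2, add --cli-binary-format raw-in-base64-out, e.g.:
# aws lambda invoke --function-name terminate-instance \
#     --invocation-type Event \
#     --cli-binary-format raw-in-base64-out \
#     --payload "$payload" /tmp/response.json
```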
I was facing the same issue. To pass the payload, in the Jenkinsfile I create a JSON payload file using echo and then pass it to the AWS CLI. It may not be the cleanest solution, but it works. Here's what my code looks like:
echo "{ \"tagKey\": \"${tagKey}\", \"tagValue\": \"${tagValue}\", \"region\": \"${region}\" }" > json.json
cat json.json
ls -alrt
aws lambda invoke --function-name tag_remediator --cli-binary-format raw-in-base64-out --payload file://json.json out_"$tagKey".txt --region "${region}"
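To keep the quoting manageable, printf with %s placeholders is another way to produce the same json.json. A sketch with stand-in values (in the Jenkinsfile these would come from parameters or environment variables):

```shell
tagKey="env"; tagValue="dev"; region="us-east-1"   # stand-in values
printf '{ "tagKey": "%s", "tagValue": "%s", "region": "%s" }\n' \
    "$tagKey" "$tagValue" "$region" > json.json
cat json.json
```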
I have a question about how to use a File Spec in an API call in JFrog.
I used the Jenkins Artifactory Plugin to upload and download artifacts to JFrog, and I am trying to rewrite the function using the JFrog API (GET/PUT) to do the same thing.
But now I have a problem: for some artifacts I used a File Spec to set some properties, and then uploaded with this File Spec:
def uploadSpec = """{
    "files": [
        {
            "pattern": "${file}",
            "target": "${target}" """
if (runID) {
    uploadSpec += """,
            "props": "artifactId=${runID}" """
}
uploadSpec += """
        }
    ]
}"""
As you can see, the spec conditionally sets an artifactId property.
In this case, when I use the JFrog API to upload artifacts, how should I set the properties?
sh """
curl -sSf -u user:pw -X PUT -T ${zipFile} 'https://${config.artifactory.name}.xxxx:443/artifactory/${path}'
"""
How can I call the PUT API and also set "props": "artifactId=${runID}"?
Any solutions?
First - if you can use the JFrog CLI, you should, because it makes things simpler and provides some advanced features out of the box, such as batch parallel uploads/downloads, File Specs, attaching properties, build-info, authentication, etc.
If you still want to use the Artifactory API directly for setting properties, which is indeed a viable option, you can do one of the following:
Add the properties as matrix parameters as part of the upload (deploy) API call.
In your case, it should be something like:
sh """
curl -sSf -u user:pw -X PUT -T ${zipFile} 'https://${config.artifactory.name}.xxxx:443/artifactory/${path};artifactId=${runID}'
"""
Note the ;key=value at the end of the URL.
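Multiple properties chain as additional ;key=value pairs. A small sketch of what the final URL looks like (the host, path, and property names are made up for illustration):

```shell
runID="42"; buildNumber="7"   # hypothetical values from the pipeline
base="https://example.jfrog.io/artifactory/libs-release/app.zip"
# Each property is appended as another ;key=value matrix parameter.
url="${base};artifactId=${runID};buildNumber=${buildNumber}"
echo "$url"
```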
Do a second call, after the upload, to set the item properties.
In your case, it should be something like:
Using the set item properties API -
sh """
curl -sSf -u user:pw -X PUT 'https://${config.artifactory.name}.xxxx:443/artifactory/api/storage/${path}?properties=artifactId=${runID}'
"""
Or, using the update item properties API -
sh """
curl -sSf -u user:pw -X PATCH 'https://${config.artifactory.name}.xxxx:443/artifactory/api/metadata/${path}' -d '{ "props": { "artifactId" : "${runID}" } }'
"""
For more information, see:
Working with JFrog Properties
Using Properties in Deployment and Resolution
Artifactory REST API - Item Properties
Is there any method or solution to invoke a GitHub Actions workflow from a Jenkins declarative pipeline?
What should the Jenkinsfile include to call said workflow?
You should be able to call the API to Create a workflow dispatch event.
See "How to trigger a GitHub Action with an HTTP request" by Riku Rouvila.
pipeline {
    agent any
    stages {
        stage("Using curl example to trigger a GitHub Action") {
            steps {
                script {
                    final String url = "https://api.github.com/repos/<USERNAME>/<REPO>/dispatches"
                    // The inner double quotes of the JSON body must be escaped
                    // inside this Groovy double-quoted string.
                    final String response = sh(script: "curl --request POST \
                        --url '$url' \
                        --header 'authorization: Bearer <TOKEN>' \
                        --data '{\"event_type\": \"hello\"}'", returnStdout: true).trim()
                    echo response
                }
            }
        }
    }
}
Replace <USERNAME>/<REPO> and <TOKEN> with the appropriate values (<TOKEN> should be a Jenkins secret).
Also, the curl command should be on one line (the \ multi-line presentation is here for readability).
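One way to sidestep the nested-quote escaping entirely is to write the dispatch payload to a file and pass it with curl's --data @file. A sketch (the event type is just an example, and the curl call is shown commented out since it needs real credentials):

```shell
# A single-quoted heredoc needs no escaping at all for the JSON body.
cat > dispatch.json <<'EOF'
{"event_type": "hello"}
EOF
cat dispatch.json
# curl --request POST --url "$url" \
#     --header "authorization: Bearer $TOKEN" --data @dispatch.json
```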
I'm running a curl command in my Jenkinsfile.
post {
    success {
        script {
            sh '''
                |SCORE=+1
                |GERRIT_COMMENT="$(cat <<-EOL
                |Sonar result was: SUCCESS
                |Report: ${Jenkins_URL}/job/${JOB_NAME}/${BUILD_NUMBER}/artifact/report1.txt
                |EOL
                |)"
                |curl -s -u ${apiToken}: ${Sonar_URL}/api/measures/component_tree?ps=100&s=qualifier,name&component=sonarqube&metricKeys=ncloc,bugs,vulnerabilities,code_smells,security_hotspots,coverage,duplicated_lines_density&strategy=children | json_pp -json_opt pretty,canonical > report1.txt
                |echo "Voting unsuccessful"
            '''.stripMargin().stripIndent()
            archiveArtifacts artifacts: 'report1.txt', fingerprint: true
            echo 'I Succeeded'
        }
    }
}
But I get the error
malformed JSON string, neither array, object, number, string or atom, at character offset 0 (before "(end of string)") at /usr/bin/json_pp
I can't use jq as it's not installed and installing it isn't an option.
The curl command works fine on my terminal but is failing in my Jenkins pipeline.
Also, when I do this instead, it works.
post {
    success {
        script {
            sh '''
                |SCORE=+1
                |GERRIT_COMMENT="$(cat <<-EOL
                |Sonar result was: SUCCESS
                |Report: ${Jenkins_URL}/job/${JOB_NAME}/${BUILD_NUMBER}/artifact/report1.txt
                |EOL
                |)"
                |echo "Voting unsuccessful"
            '''.stripMargin().stripIndent()
            sh """
                curl -s -u ${apiToken}: '${Sonar_URL}/api/measures/component_tree?ps=100&s=qualifier,name&component=sonarqube&metricKeys=ncloc,bugs,vulnerabilities,code_smells,security_hotspots,coverage,duplicated_lines_density&strategy=children' | json_pp -json_opt pretty,canonical > report1.txt
            """
            archiveArtifacts artifacts: 'report1.txt', fingerprint: true
            echo 'I Succeeded'
        }
    }
}
But it throws a warning in the console output.
Warning: A secret was passed to "sh" using Groovy String interpolation, which is insecure. Affected argument(s) used the following variable(s): [apiToken]
What am I doing wrong, please?
In a Jenkins pipeline, how would you properly pass a JSON response using curl into a file?
I recommend not using shell scripts whenever it is possible to avoid them. Shell scripts are not cross-platform and require installing additional tools (e.g. curl).
In your case the curl call can be replaced by the httpRequest step.
First, let's replace the curl call and save the result in a componentTree.json file:
httpRequest(
url: "${Sonar_URL}/api/measures/component_tree?ps=100&s=qualifier,name&component=sonarqube&metricKeys=ncloc,bugs,vulnerabilities,code_smells,security_hotspots,coverage,duplicated_lines_density&strategy=children",
authorization: 'id-of-credentials-which-was-used-to-create-the-apiToken-variable',
outputFile: 'componentTree.json'
)
You want to format the JSON data in a human-readable format, so let's use the readJSON and writeJSON steps:
def json = readJSON(file: 'componentTree.json')
writeJSON(json: json, file: 'report1.txt', pretty: 4)
Now the report1.txt file contains JSON formatted with indent 4.
The componentTree.json file is written and read only once, so let's reduce the number of IO operations:
def response = httpRequest(
url: "${Sonar_URL}/api/measures/component_tree?ps=100&s=qualifier,name&component=sonarqube&metricKeys=ncloc,bugs,vulnerabilities,code_smells,security_hotspots,coverage,duplicated_lines_density&strategy=children",
authorization: 'id-of-credentials-which-was-used-to-create-the-apiToken-variable'
)
def json = readJSON(text: response.content)
writeJSON(json: json, file: 'report1.txt', pretty: 4)
About the warning:
Warning: A secret was passed to "sh" using Groovy String interpolation, which is insecure. Affected argument(s) used the following variable(s): [apiToken]
Secrets should never be interpolated, because they may contain special characters that could be interpreted by the shell. Example:
my secret: My' https://example.org; cat /etc/passwd; echo \
command: curl -u '${password}' https://private.server/path/file.txt
After the interpolation the following command is called:
curl -u 'My' https://example.org; cat /etc/passwd; echo \' https://private.server/path/file.txt
There are two options to fix it:
if apiToken is an environment variable:
sh "curl -s -u \$apiToken: '${Sonar_URL}/api/measures/component..."
if apiToken is a Groovy variable:
withEnv(["CREDENTIALS=${apiToken}"]) {
sh "curl -s -u \$CREDENTIALS: '${Sonar_URL}/api/measures/component..."
}
In both cases the dollar sign ($) is escaped before the variable name, which means the shell resolves it (the value is taken from an environment variable).
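Incidentally, the likely reason the inline curl failed while the quoted variant worked is the unquoted & in the URL: the shell treats & as "run in background", so the command was cut off at the first & and json_pp received empty input, hence "malformed JSON ... at character offset 0". A quick way to see the difference (no network involved; count_args is a throwaway helper):

```shell
# Count how many arguments a command actually receives.
count_args() { echo $#; }

url='https://sonar.example/api/measures/component_tree?ps=100&s=name'
count_args "$url"
# Quoted, the whole URL arrives as one argument (prints 1). Unquoted, the
# shell would stop parsing at '&' and background the command, so curl would
# only ever see 'https://sonar.example/api/measures/component_tree?ps=100'.
```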
I need some help fetching the GitHub payload in a Jenkinsfile without installing any plugin.
If anyone can provide a Jenkinsfile code snippet to access the GitHub payload from the webhook, it would be of great help.
I am able to trigger the Jenkins job from the GitHub webhook, but I need the payload as well to process further.
Any help would be appreciated. Thanks.
Please find the below Groovy script:
stage('Pull Request Info') {
    agent {
        docker {
            image 'maven'
            args '-v $HOME/.m2:/home/jenkins/.m2 -ti -u 496 -e MAVEN_CONFIG=/home/jenkins/.m2 -e MAVEN_OPTS=-Xmx2048m'
        }
    }
    steps {
        script {
            withCredentials([usernameColonPassword(credentialsId: "${env.STASH_CREDENTIAL_ID}",
                    variable: 'USERPASS')]) {
                def hasChanges = maven.updateDependencies()
                if (hasChanges != '') {
                    // \$USERPASS is resolved by the shell, not by Groovy interpolation,
                    // which avoids leaking the secret into the process arguments.
                    def pr1 = sh(
                        script: "curl -s -u \$USERPASS -H 'Content-Type: application/json' https://xx.example/rest/some/endpoint",
                        returnStdout: true
                    )
                    def pr = readJSON(text: pr1)
                    println pr.fromRef
                }
            }
        }
    }
}
The above script uses curl to fetch the details about the pull request. I have stored the credentials in Jenkins and created an environment variable for the credential ID.
You can replace the URL with your endpoint. You can also modify the script to use jq if you have jq installed on the machine.
In this case, I'm using readJSON to parse the JSON, which is part of the Pipeline Utility Steps plugin. My question would be: why not use the plugin, as it provides the needed functionality?
If you still don't want to use a plugin, take a look at JSON parsing with Groovy.
I'm trying to call a remote job from one Jenkins server to another. I have this working fine via a shell script; however, trying to translate it into a Jenkinsfile is causing me issues. The environment variable is always null when used inside a stage, even though this article says it should be globally available.
pipeline {
    agent any
    /* get crumb for CSRF */
    environment {
        def crumb = sh 'curl https://jenkins-remote/crumbIssuer/'
    }
    stages {
        /* call remote job */
        stage("remote") {
            steps {
                sh "curl -X POST -H ${env.crumb} https://jenkins-remote/buildWithParameters?foo"
            }
        }
    }
}
The trimmed output looks like:
[remote_pipeline] Running shell script
+ curl -X POST -H null
I am using Jenkins v2.89.4, new "Pipeline" job, "pipeline script".
Thanks to @TrafLaf for pointing out that the variable is null because it does not get set to the output of the curl command. My hacky solution was this:
environment {
    def crumbRequest = sh 'curl https://jenkins-remote/crumbIssuer/ > crumbHeader'
    crumbHeader = readFile('crumbHeader').trim()
}
As per the official documentation, this is how you define environment variables:
pipeline {
    agent any
    environment {
        DISABLE_AUTH = 'true'
        DB_ENGINE = 'sqlite'
    }
    stages {
        stage('Build') {
            steps {
                echo "${env.DB_ENGINE}" // to access
            }
        }
    }
}
But you have coded it wrong:
environment {
def crumb = sh 'curl https://jenkins-remote/crumbIssuer/'
}
So please fix the rest accordingly.
The sh step can now return output, so in theory the following should work (untested):
environment {
    crumb = sh(script: 'curl https://jenkins-remote/crumbIssuer/',
               returnStdout: true).trim()
}