Configure ci_secrets for Jenkins CI

ci_secrets (https://github.com/pmarlow/ci_secrets) is a repository secret-scanning tool that can easily be integrated with Travis CI and GitLab CI without the need for a persistent server.
Configuring it in a Jenkins pipeline is a bit tricky, though, as Jenkins does not provide environment variables like
"TRAVIS_COMMIT_RANGE",
which is required to determine the last-scanned commit from the first commit in the range.
Is there a way to implement this in a Jenkins pipeline?
For example, in the Travis CI configuration:
script:
  - export COMMIT_RANGE=${TRAVIS_COMMIT_RANGE:-"0000000000000000000000000000000000000000"}
  - export LAST_COMMIT=${COMMIT_RANGE%%.*}
  - ci_secrets --since $LAST_COMMIT --includeMergeCommit --log INFO
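A possible starting point, not from the original question: when the job checks out from Git, the Jenkins Git plugin exposes GIT_COMMIT and, once at least one build has succeeded, GIT_PREVIOUS_SUCCESSFUL_COMMIT, which could stand in for the Travis range. A minimal sketch of a shell step under that assumption:
# Hedged sketch: GIT_PREVIOUS_SUCCESSFUL_COMMIT is set by the Jenkins Git
# plugin when a previous successful build exists; fall back to the null SHA
# so that the very first build scans the whole history.
export LAST_COMMIT=${GIT_PREVIOUS_SUCCESSFUL_COMMIT:-"0000000000000000000000000000000000000000"}
ci_secrets --since $LAST_COMMIT --includeMergeCommit --log INFO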

Related

How to set SDK release when using Bitbucket repository integration

I have installed the Bitbucket integration on Sentry and used the Bitbucket pipeline to automatically notify and associate releases with commits, as described here.
I have also set up source maps to be uploaded as seen below:
sentry-cli releases files $BITBUCKET_COMMIT upload-sourcemaps build
The Bitbucket pipeline and the source map upload both use $BITBUCKET_COMMIT as the identifier.
I am trying to figure out how to configure the SDK release to use this variable; my current setup is below:
if (process.env.NODE_ENV.toString().toLowerCase() === 'production') {
  Sentry.init({
    dsn: process.env.REACT_APP_SENTRY_DSN,
  });
}
I found out how to do this. BITBUCKET_COMMIT is an environment variable available in the Bitbucket pipeline during the build, so I made it available to my Docker container by passing it as an argument in the Docker build step.
docker build --build-arg release=$BITBUCKET_COMMIT .
I could then make the passed variable available to my React build command through the Dockerfile:
# Dockerfile
ARG release
ENV BITBUCKET_COMMIT=$release
Then within my package.json I set a release variable during the build (Sentry.init can then read it as release: process.env.REACT_APP_SENTRY_RELEASE):
"build": "REACT_APP_SENTRY_RELEASE=$BITBUCKET_COMMIT react-scripts build"

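For reference, a hedged sketch of the corresponding Bitbucket Pipelines step; the image tag my-app is a placeholder, not from the original answer:
# Pass the commit into the image build, then upload the source maps
# under the same release identifier.
docker build --build-arg release=$BITBUCKET_COMMIT -t my-app:$BITBUCKET_COMMIT .
sentry-cli releases files $BITBUCKET_COMMIT upload-sourcemaps build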
Triggering a Jenkins job from a GitLab pipeline stage and moving to the next stage only on successful completion of the job

Can you please help? I have the following scenario, and I have been through many videos and blogs but could not find anything matching my use case.
Requirement:
To write a CI/CD pipeline in GitLab which can facilitate the following stages in this order:
- verify # unit test, sonarqube, pages
- build # package
- publish # copy artifact in repository
- deploy # deploy artifact on runtime in a test environment
- integration # run postman\integration tests
All the other stages are fine and working, but for the deploy stage, because of a few restrictions, I have to trigger an existing Jenkins job via the Jenkins remote API with the following script. The problem is that the call returns asynchronously: it starts the Jenkins job, the deploy stage completes immediately, and the pipeline moves on to the next stage (integration).
Run Jenkins Job:
  image: maven:3-jdk-8
  tags:
    - java
  environment: development
  stage: deploy
  script:
    - artifact_no=$(grep -m1 '<version>' pom.xml | grep -oP '(?<=>).*(?=<)')
    - curl -X POST http://myhost:8081/job/fpp/view/categorized/job/fpp_PREP_party/build --user mkumar:1121053c6b6d19bf0b3c1d6ab604f22867 --data-urlencode json="{\"parameter\":[{\"name\":\"app_version\",\"value\":\"$artifact_no\"}]}"
Note: We are using GitLab CE, so the Jenkins CI project integration is not available.
I am looking for a way to trigger the Jenkins job from the pipeline so that the integration stage starts executing only on successful completion of the Jenkins job.
Thanks for the help!
Retrieving the status of a Jenkins job that is triggered programmatically through the remote access API is notorious for being quite convoluted.
Normally you would expect the response header to contain, under the Location attribute, a URL that you can poll to get the status of your request, but unfortunately there are some in-between steps to reach that point. You can find a guide in this post. You may also have a look at this older post.
Once you have the URL, you can poll it and parse the job status, then exit 1 or exit 0 in your script to force the job that is invoking the external job to fail or succeed, depending on how you want to assert the result of the remote job.
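A minimal shell sketch of that flow, assuming API-token authentication and that jq is available on the runner; the host, job path, user, and token variables are placeholders:
# Trigger the job and capture the queue item URL from the Location header.
QUEUE_URL=$(curl -s -i -X POST "$JENKINS_URL/job/fpp_PREP_party/build" \
  --user "$JENKINS_USER:$JENKINS_TOKEN" \
  --data-urlencode json="{\"parameter\":[{\"name\":\"app_version\",\"value\":\"$artifact_no\"}]}" \
  | awk 'tolower($1) == "location:" {print $2}' | tr -d '\r')

# Wait until the queue item has been assigned a build (executable.url);
# the Location URL ends with a trailing slash, so api/json is appended directly.
BUILD_URL=""
while [ -z "$BUILD_URL" ]; do
  sleep 10
  BUILD_URL=$(curl -s --user "$JENKINS_USER:$JENKINS_TOKEN" "${QUEUE_URL}api/json" | jq -r '.executable.url // empty')
done

# Poll the build until .result is no longer null, then assert success.
RESULT="null"
while [ "$RESULT" = "null" ]; do
  sleep 10
  RESULT=$(curl -s --user "$JENKINS_USER:$JENKINS_TOKEN" "${BUILD_URL}api/json" | jq -r '.result')
done
[ "$RESULT" = "SUCCESS" ] || exit 1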

Jenkins Multibranch Pipeline: script / jenkinsfile as svn external

I have a multibranch pipeline in Jenkins. I want to include my script file (jenkinsfile) as an svn file external in my development branches, to keep the script centralized for all branches. Unfortunately, the multibranch pipeline scan cannot find the script file, because it only looks inside the declared branch and not in the included svn external locations.
Does anyone have an idea how I can fix this?
Below is an example of my svn structure, job config and further information.
SVN:
root/
  scripts/
    jenkinsfile
  code/
    version1/
      branchX/
        ...
    version11/
      branchY/
        ...
SVN external property for branchX, branchY, etc.
Local path: jenkinsfile
URL: ^/scripts/jenkinsfile
Revision Peg: 12345
Multibranch job configuration:
Subversion
Project Repository Base: http://.../root/code/
Include branches: version1/branchX, version11/branchY
Build configuration
Mode: by Jenkinsfile
Script path: jenkinsfile
Log message of scan in multibranch pipeline:
...
Checking candidate branch /code/version1/branchX#HEAD
‘jenkinsfile’ not found
Does not meet criteria
...
I already tried to disable the lightweight checkout of the subversion scm plugin according to this advice:
Multibranch pipeline with jenkinsfile in svn:external
(I've added -Djenkins.scm.impl.subversion.SubversionSCMFileSystem.disable=true under <service><arguments>... in jenkins.xml)
But Jenkins is still not able to find the script. In fact, if I put my script directly into e.g. branchX, the disabled lightweight checkout leads to a double checkout into my workspace (the first to read the script file, and the second because checkout is the first stage in the script itself).
Maybe my whole setup is wrong, or not the ideal way of doing this?
I would be pleased about your help and tips. Thanks and greetings!
If you are working on a Linux or BSD (macOS) system, you could create a hard link from root/scripts/jenkinsfile to root/code/version#/branchX/jenkinsfile for each active branch.
That way, each branch will have its own jenkinsfile available locally, enabling you to use the lightweight checkout, and any change you introduce to the jenkinsfile in any location will be available to all other branches (the file system keeps a single copy of the file, regardless of it being accessible from many different locations).
The bash command to create such a link is:
ln root/scripts/jenkinsfile root/code/version#/branchX/jenkinsfile
You will need to remember to create a new link each time a branch is created, or automate that using hooks. A small sketch that refreshes the links for all branches is shown below.
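This is a hedged sketch, assuming the directory layout shown above:
# (Re)create the hard link in every branch working copy; -f replaces
# an existing file, and the glob assumes root/code/<version>/<branch>/.
for branch in root/code/*/*/; do
  ln -f root/scripts/jenkinsfile "${branch}jenkinsfile"
done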

How to insert properties in Jenkinsfile in multibranch pipeline?

I configured a Jenkins multibranch pipeline on Jenkins version 2.60.2.
I'm looking for a way to keep my passwords in the Jenkins multibranch pipeline configuration, so the Jenkinsfile can take them as parameters for the execution of its stages. Is there a way to set these properties within the Jenkins UI?
I found a similar question here, but I think there is a preferable way.
Thanks
For using credentials in a Jenkins Pipeline, there are a couple of plugins that I would consider essentially part of the Jenkins core (even though they are plugins):
Credentials Plugin
Credentials Binding Plugin
These can combine to give you a way to administratively manage credentials as well as provide for ways to consume them inside of Jobs. There are additional plugins built on top of these providing credential type implementations. For example, the SSH Credentials Plugin allows you to store SSH credentials in Jenkins.
The Credentials Binding Plugin provides the withCredentials step. The documentation has some examples on how you could use it. Here is an example from the documentation:
node {
    withCredentials([
        usernameColonPassword(
            credentialsId: 'mylogin',
            variable: 'USERPASS'
        )
    ]) {
        sh '''
            set +x
            curl -u $USERPASS https://private.server/ > output
        '''
    }
}

How can I retrieve the published artifacts from Artifactory in my Jenkins pipeline script?

I am trying to get the list of artifacts published to my Artifactory by my deploy.
I tried to do so via the BuildInfoAccessor, but the current version is lacking the getDeployedArtifacts() function.
I even tried to read the Jenkins build.log object, but it is somehow missing the Artifactory plugin's output about which artifacts are deployed.
Can someone give me a hint on where to look, or an example?
I am not sure whether there is a better way to print the BuildInfo from the Artifactory Jenkins plugin itself, but you can get the published info of the Jenkins build via the Artifactory REST API:
Artifactory REST API
You can get the build number from the Jenkins environment variable ${BUILD_NUMBER} and make an HTTP GET call via sh and curl (or another suitable step) in your pipeline script.
sh "curl http://artifactory.org.net/api/build/my-build/${BUILD_NUMBER}"
Make use of the withCredentials step to pass the username/password.
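For instance, a hedged sketch that lists the deployed artifact names from the build-info JSON, assuming jq is available and that the credential variables come from withCredentials:
# Fetch the build info and list the deployed artifact names; the JSON
# layout is buildInfo.modules[].artifacts[] in the Artifactory build API.
curl -s -u "$ARTIFACTORY_USER:$ARTIFACTORY_PASS" \
  "http://artifactory.org.net/api/build/my-build/${BUILD_NUMBER}" \
  | jq -r '.buildInfo.modules[].artifacts[].name'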
Caution: this answer is based purely on my theoretical knowledge.
