How to do Snyk code test in Jenkins pipeline?

I am helping our DevOps team integrate Snyk into our Jenkins pipelines for SAST. By default, this Snyk plugin seems to run snyk test (which does open-source dependency scans) and appends the additional arguments provided to it. I identified this behavior by checking the console log, where the actual command that ran is displayed. What we actually want is the source code scan, snyk code.
The command I observed in the console log is this: <jenkins tools installation path>/snyk-linux test --json --severity-threshold=high --file=<path>/package.json. The snyk-linux test part seems to be predefined.
Can someone please help me regarding this?

As you have correctly observed, the Snyk Security Jenkins plugin only offers access to the Snyk CLI snyk test command and nothing else.
Currently, the only way to do this is to call the Snyk CLI directly.
pipeline {
    agent any
    environment {
        SNYK_HOME = tool name: 'Snyk'
    }
    stages {
        stage('Snyk Code') {
            steps {
                sh "${SNYK_HOME}/snyk-linux code test"
            }
        }
    }
}
Of course, you also need to expose your Snyk API token to the CLI, typically via the SNYK_TOKEN environment variable.
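For illustration, a minimal sketch of the same pipeline that pulls the token from a Jenkins "Secret text" credential (the credential ID snyk-api-token is just a placeholder) and exposes it as SNYK_TOKEN, which the Snyk CLI reads automatically:
pipeline {
    agent any
    environment {
        SNYK_HOME  = tool name: 'Snyk'
        // 'snyk-api-token' is a placeholder ID for a "Secret text" credential holding your Snyk API token
        SNYK_TOKEN = credentials('snyk-api-token')
    }
    stages {
        stage('Snyk Code') {
            steps {
                // 'snyk code test' runs the static source code analysis rather than the dependency scan
                sh "${SNYK_HOME}/snyk-linux code test --severity-threshold=high"
            }
        }
    }
}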

Related

Jenkins - How to access Build number variable and use it as a Prefix/Suffix of a log name during Post build actions

I am trying to use Jenkins' build number in the naming of a log that I would want to be saved as a post-build action.
Will the below format work?
C:\Jenkins\workspace\Jmeter_Jenkins_Test_Job\Jenkins_Results\"${env.BUILD_NUMBER}"results.jtl
As per the Building a software project wiki article, the environment variable you're looking for is BUILD_NUMBER, and in the case of the Windows operating system you can access it as:
%BUILD_NUMBER%
so if you want to amend JMeter result file name to include build number you can do something like:
jmeter -n -t /path/to/test.jmx -l /path/to/result-%BUILD_NUMBER%-.jtl
and at runtime the variable will be evaluated to the current Jenkins build number.
More information just in case: Continuous Integration 101: How to Run JMeter With Jenkins
Without having laid my hands on a Jenkins installation for quite some time:
Yes, you can do that and it has been done before!
You could do something like:
pipeline {
    agent any
    stages {
        stage('test') {
            steps {
                // double quotes so Groovy interpolates env.WORKSPACE, env.BUILD_ID and env.BUILD_NUMBER
                bat "path/to/jmeter.bat -n -t ${env.WORKSPACE}/my_test.jmx -l my_test_${env.BUILD_ID}_${env.BUILD_NUMBER}.jtl"
            }
        }
    }
}
I would propose creating the HTML dashboard report first, though, and then publishing that in Jenkins - you could use https://jenkins.io/doc/pipeline/steps/htmlpublisher/ to do that (see the sketch below). Further, you should avoid absolute paths in favor of the WORKSPACE environment variable (see https://jenkins.io/doc/book/pipeline/jenkinsfile/#using-environment-variables for reference).
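For example, a minimal sketch of publishing the JMeter HTML dashboard with the HTML Publisher plugin; the reportDir value is a placeholder for wherever your jmeter ... -e -o output lands:
post {
    always {
        // Publish the dashboard generated by 'jmeter ... -e -o reports/dashboard'
        publishHTML(target: [
            reportDir            : 'reports/dashboard',   // placeholder output directory
            reportFiles          : 'index.html',
            reportName           : 'JMeter Dashboard',
            keepAll              : true,
            alwaysLinkToLastBuild: true,
            allowMissing         : false
        ])
    }
}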
If you need some general idea of how to run the test via Jenkins you could have a look at https://code-maven.com/jenkins-pipeline-running-external-programs and https://jmeter.apache.org/usermanual/get-started.html#non_gui
If you already tried to achieve something and need more specific help, please come forward with some more detail.

Find artifacts in post-build actions

In a nutshell:
How can I access the location of the produced artifacts within a shell script started in a build or post-build action?
The longer story:
I'm trying to set up a Jenkins job to automate the building and propagation of Debian packages.
So far, I have already been successful in using the debian-pbuilder plugin to perform the build process, such that Jenkins presents the final artifacts after successfully finishing the job:
mypackage_1+020200224114528.NOREV.4_all.deb
mypackage_1+020200224114528.NOREV.4_amd64.buildinfo
mypackage_1+020200224114528.NOREV.4_amd64.changes
mypackage_1+020200224114528.NOREV.4.dsc
mypackage_1+020200224114528.NOREV.4.tar.xz
Now I would like to also automate the deployment process into the local reprepro repository, which would actually just require invoking a simple shell script I've put together.
My problem: I find no way to determine the artifact location for that deployment script to operate on. The "debian-pbuilder" plugin generates the artifacts in a temporary directory ($WORKSPACE/binaries.tmp15567690749093469649), which changes with every build.
Since the artifacts are listed properly in the finished job status view, I would expect the artifact details to be provided to the script (e.g. via environment variables). But that is obviously not the case.
I've already searched extensively for a solution, but didn't find anything helpful.
Or is it me (still somewhat of a rookie with Jenkins) following a wrong approach here?
You can use archiveArtifacts. You have a binaries.tmp* directory in the workspace and you can use it, but clear the workspace first with deleteDir() before the build runs.
Pipeline example:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                deleteDir()
                ...
            }
        }
    }
    post {
        always {
            archiveArtifacts artifacts: 'binaries*/**', fingerprint: true
        }
    }
}
You can also check https://plugins.jenkins.io/copyartifact/
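As a rough sketch of that copyartifact route (the project name, the filter and the deploy script below are placeholders), a downstream job could pull the archived .deb files and hand them to your reprepro script:
pipeline {
    agent any
    stages {
        stage('Deploy to reprepro') {
            steps {
                // Copy the artifacts archived by the build job (names are hypothetical)
                copyArtifacts projectName: 'mypackage-build', filter: 'binaries*/**', selector: lastSuccessful()
                // Hand the copied .deb files to your own reprepro deployment script
                sh './deploy-to-reprepro.sh binaries*/'
            }
        }
    }
}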

How to disable "build now" option?

For a given scripted pipeline (Jenkins), the pipeline should only get triggered through a webhook from GitLab.
The Build Now option should be disabled for that pipeline.
Can we configure Jenkins to disable the Build Now option for a specific pipeline script job?
EDIT: Here is the solution with a scripted pipeline:
node {
    def userIdCause = currentBuild.getBuildCauses('hudson.model.Cause$UserIdCause')
    stage("Authorize Usage") {
        if (userIdCause.size()) {
            error("Aborting build due to manual start - that's not permitted!")
        }
    }
}
How about the following solution, without any extra plugin, in a declarative pipeline:
pipeline {
    ...
    stages {
        stage("Authorize Usage") {
            when { expression { currentBuild.getBuildCauses('hudson.model.Cause$UserIdCause').size() > 0 } }
            steps {
                script {
                    currentBuild.description = "Aborting build due to manual start - that's not permitted!"
                    error("Aborting build due to manual start - that's not permitted!")
                }
            }
        }
        ...
    }
}
Have you taken a look at this plugin supplied on the Jenkins site? Matrix Authorization Strategy Plugin:
Matrix Strategy
Specifically this section: Allow configuring per-agent permissions. This allows e.g. restricting per-agent build permissions when using the Authorize Project plugin (JENKINS-46654)
Not ideal, but if this is a freestyle pipeline job, a quick workaround is to add an "Execute shell" build step as the first step. You can use this to prevent a build when nothing has changed.
Every time your sources change and you push to your repo, a build will have been triggered, and as there are changes this script will not exit.
When you click Build Now, nothing should have changed in your repo (as the only way it can change is through a push, which would then trigger a build), so the script exits and fails the build.
if [[ "$GIT_COMMIT" == "$GIT_PREVIOUS_COMMIT" ]]
then
    echo "Exiting build - Nothing has changed"
    echo "This is to prevent the usage of Jenkins 'build now'"
    exit 1
fi
EDIT: This is the answer to the question from user #mohet in the comments of my other answer, because it was too long for the comment section (https://stackoverflow.com/a/55058788/7746963).
The currentBuild variable, which is of type RunWrapper, may be used to refer to the currently running build...
Source: https://opensource.triology.de/jenkins/pipeline-syntax/globals.
hudson.model is the package name of most of the corresponding core Jenkins classes; 'Hudson' because Jenkins was once forked from the codebase of its ancestor, named 'Hudson'.
You can look them up here: https://javadoc.jenkins.io/hudson/model/package-summary.html.
There you will also find https://javadoc.jenkins.io/hudson/model/Cause.UserIdCause.html. Specifying the package$classname directly in methods like getBuildCauses is a deliberate choice by the Jenkins dev team; it reduces the potential for failure and makes the code more readable and understandable.
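As a quick scripted-pipeline sketch, you can dump the causes of the current build to see which of those classes actually show up:
node {
    // Without an argument you get every cause of the current build
    echo "All causes: ${currentBuild.getBuildCauses()}"
    // With the fully qualified package$classname you filter for manual starts only
    echo "Manual starts: ${currentBuild.getBuildCauses('hudson.model.Cause$UserIdCause')}"
}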

How to test Jenkinsfile

I am trying to write test cases to validate the Jenkinsfile, but the loadScript function is not working: it expects an extension to be provided and throws a ResourceException for loadScript("Jenkinsfile").
Is there a better way to test the Jenkinsfile?
The problem is that there are not enough tools for the development of pipelines. Pipelines are a DSL, and that imposes restrictions.
One interesting approach is to use flags, for example a test flag defined outside the pipeline (in the job). If test=true, the pipeline switches some "production" logic to "test" logic: it selects another agent, loads artifacts into another repository, runs another command, and so on.
But recently the Pipeline Unit Testing Framework appeared. It allows you to unit test pipelines and shared libraries before running them in full. It provides a mock execution environment where real pipeline steps are replaced with mock objects that you can use to check for expected behavior.
Useful links:
Jenkins World 2017: JenkinsPipelineUnit: Test your Continuous Delivery Pipeline
Pipeline Development Tools
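For instance, a minimal JenkinsPipelineUnit sketch could look like the following. The mocked sh step and the scriptRoots/scriptExtension tweak are assumptions for loading a file literally named Jenkinsfile, exact field and method names vary a bit between framework versions (older releases use loadScript where newer ones offer runScript), and for a declarative Jenkinsfile you would extend the framework's DeclarativePipelineTest base class instead:
import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test

class JenkinsfileTest extends BasePipelineTest {

    @Before
    void setUp() {
        scriptRoots = ['.']       // look for the Jenkinsfile in the project root
        scriptExtension = ''      // don't require the default .jenkins file extension
        super.setUp()
        // Replace the real sh step with a mock so no shell command actually runs
        helper.registerAllowedMethod('sh', [String], { cmd -> println "mock sh: ${cmd}" })
    }

    @Test
    void jenkinsfileRunsWithoutErrors() {
        runScript('Jenkinsfile')  // loads and executes the pipeline against the mocked steps
        printCallStack()          // dumps every recorded step invocation for inspection
        assertJobStatusSuccess()
    }
}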
You can validate your declarative pipeline locally thanks to Jenkins' built-in features. This can be done using a Jenkins CLI command or by making an HTTP POST request with appropriate parameters.
The command is the following:
curl -s -X POST -F "jenkinsfile=<YourJenkinsfile" \
     https://user:password@jenkins.example.com/pipeline-model-converter/validate
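The Jenkins CLI route mentioned above would look roughly like this (a sketch assuming the CLI jar and an API token for user; adjust the URL to your own controller):
# declarative-linter reads the Jenkinsfile from stdin and reports any syntax errors
java -jar jenkins-cli.jar -s https://jenkins.example.com/ -auth user:apitoken declarative-linter < Jenkinsfile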
For a practical example follow this guide:
https://pillsfromtheweb.blogspot.com/2020/10/validate-jenkinsfile.html

Jenkins 2 pipeline deploying to udeploy

I am creating a CI/CD pipeline. I am trying to create a Groovy function in order to deploy a build to udeploy.
I know I will need to pass the parameters used in to the function such as:
udeployServer,
component,
artifactDirectory,
version,
deployApplication,
environment and
deployProcess.
I was wondering has anyone tried to implement this or has anyone any idea how I should approach this?
Thanks
I don't know anything about udeploy servers, but I do know there is no pipeline plugin for udeploy, which means that you will not have a function such as:
udeploy: server=yourserver component=yourcomponent artifactDirectory=...
However, Jenkins allows you to use shell commands inside your Groovy pipeline, so you should be able to do pretty much everything you need. So I guess the real question is: how do you usually deploy a build to udeploy? Do you do it via a REST API, do you push a file via FTP, ...?
The Jenkins build will be pretty straightforward; have a look at how to check out and build using a Jenkins pipeline.
An example pipeline could look like:
node {
    stage('Build') {
        def mvnHome = tool 'M3'
        sh "${mvnHome}/bin/mvn clean install"
    }
    //... Some other stages as needed...
    stage('Deploy') {
        sh "execute sh deploy script here..."
    }
}
... where your deploy stage could use other plugins to copy files to your server, run REST API requests, etc. While writing a pipeline, have a look at the Pipeline Syntax link for the Snippet Generator, which gives more detailed information about existing plugins.
