How to set up SonarScanner in a Jenkins Declarative Pipeline

I'm facing a problem implementing the SonarQube scanner for my repository in a Jenkinsfile. I don't know where I should add the SonarQube scanner properties in the Jenkinsfile.
I've set up Jenkins locally on my Windows system. The projects are purely based on Python, Ruby & React.
pipeline {
    agent { label 'master' }
    triggers {
        GenericTrigger(
            genericVariables: [
                [key: 'pr_from_branch', value: '$.pullrequest.source.branch.name',
                 expressionType: 'JSONPath', regexpFilter: '', defaultValue: '']
            ],
            token: 'test'
        )
    }
    options {
        buildDiscarder(logRotator(numToKeepStr: '5'))
    }
    stages {
        stage('Initialize & SonarQube Scan') {
            steps {
                script { // "def" needs a script block inside declarative steps
                    def scannerHome = tool 'sonarScanner'
                    withSonarQubeEnv('My SonarQube Server') {
                        bat "pip install -r requirements.txt"
                        bat "${scannerHome}\\bin\\sonar-scanner.bat"
                    }
                }
            }
        }
        stage('Quality Gate') {
            steps {
                sleep time: 3000, unit: 'MILLISECONDS'
                timeout(time: 1, unit: 'MINUTES') { // just in case something goes wrong, the pipeline will be killed after a timeout
                    script {
                        def qg = waitForQualityGate() // reuse taskId previously collected by withSonarQubeEnv
                        if (qg.status != 'OK') {
                            error "Pipeline aborted due to quality gate failure: ${qg.status}"
                        }
                    }
                }
            }
        }
        stage('Smoke Test') {
            steps {
                bat "pytest -s -v tests/home/login_test.py"
                script { currentBuild.result = 'SUCCESS' } // Groovy, so it cannot live inside the bat block
            }
        }
    }
}
The properties include:
# SonarQube configuration
sonar.projectKey=<*****>
sonar.projectName=<project name>
sonar.projectVersion=1.0
sonar.login=<sonar-login-token>
sonar.sources=src
sonar.exclusions=**/*.doc,**/*.docx,**/*.ipch,/node_modules/
sonar.host.url=http://<url>/
# Sonar for Bitbucket plugin configuration
sonar.bitbucket.repoSlug=<project name>
sonar.bitbucket.accountName=<name>
sonar.bitbucket.oauthClientKey=<OAuth_Key>
sonar.bitbucket.oauthClientSecret=<OAuth_secret>
sonar.analysis.mode=issues
I can manually add these properties to a sonar-project.properties file and put that file in my project root directory, but then the analysis runs locally, not on the server. To avoid that, I want to add these properties to the Jenkinsfile.

We run the Sonar scanner as a Docker container, but this should give you a fair idea of how to use your properties in a Jenkinsfile.
stage("Sonar Analysis"){
sh "docker pull docker.artifactory.company.com/util-sonar-runner:latest"
withSonarQubeEnv('sonarqube'){
sh "docker run --rm -v ${workspace}:/opt/spring-service -w /opt/spring-service -e SONAR_HOST_URL=${SONAR_HOST_URL} -e SONAR_AUTH_TOKEN=${SONAR_AUTH_TOKEN} docker.artifactory.company.com/util-sonar-runner:latest /opt/sonar-scanner/bin/sonar-scanner -Dsonar.host.url=${SONAR_HOST_URL} -Dsonar.login=${SONAR_AUTH_TOKEN} -Dsonar.projectKey=spring-service -Dsonar.projectName=spring-service -Dsonar.projectBaseDir=. -Dsonar.sources=./src -Dsonar.java.binaries=./build/classes -Dsonar.junit.reportPaths=./build/test-results/test -Dsonar.jacoco.reportPaths=./build/jacoco/test.exec -Dsonar.exclusions=src/test/java/**/* -Dsonar.fortify.reportPath=fortifyResults-${IMAGE_NAME}.fpr -Dsonar.password="
}
}
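Note that withSonarQubeEnv is what injects SONAR_HOST_URL and SONAR_AUTH_TOKEN into the step's environment, which is why they can be passed straight through to the container. Everything else on the command line is an ordinary -D scanner property, so this is also where settings like your sonar.bitbucket.* values would go.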

You run the pipeline step like this. The Sonar server properties can be defined under a profile in the pom.xml file.
steps {
    withSonarQubeEnv('SonarQube') {
        sh 'mvn -Psonar -Dsonar.sourceEncoding=UTF-8 org.sonarsource.scanner.maven:sonar-maven-plugin:3.0.2:sonar'
    }
}
The SonarQube scanner needs to be defined in the Jenkins Global Tool Configuration section.
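For the original non-Maven projects, the same idea works with the standalone scanner. A minimal sketch, assuming the tool is registered as 'sonarScanner' and the server as 'My SonarQube Server' (the names used in the question), which passes the sonar.* properties as -D flags instead of a sonar-project.properties file (angle-bracket values are placeholders):
steps {
    script {
        def scannerHome = tool 'sonarScanner' // name from Global Tool Configuration
        withSonarQubeEnv('My SonarQube Server') { // injects sonar.host.url and the token, so those can be dropped
            bat """
            ${scannerHome}\\bin\\sonar-scanner.bat ^
                -Dsonar.projectKey=<project-key> ^
                -Dsonar.projectName=<project-name> ^
                -Dsonar.projectVersion=1.0 ^
                -Dsonar.sources=src ^
                -Dsonar.exclusions=**/*.doc,**/*.docx,**/*.ipch
            """
        }
    }
}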


Jenkins using docker agent with environment declarative pipeline

I would like to install Maven and npm via a Docker agent using a Jenkins declarative pipeline. But when I use the script below, Jenkins throws the error shown below. It might be caused by agent none, but how can I use a node with a Docker agent in a declarative pipeline?
ERROR: Attempted to execute a step that requires a node context while
‘agent none’ was specified. Be sure to specify your own ‘node { ... }’
blocks when using ‘agent none’.
I tried to set agent any, but this time I received the error "Still waiting to schedule task
Waiting for next available executor".
pipeline {
    agent none
    // environment {
    //     proxy = https://
    //     stable_revision = sh(script: 'curl -H "Authorization: Basic $base64encoded"
    // }
    stages {
        stage('Build') {
            agent {
                docker { image 'maven:3-alpine' }
            }
            steps {
                sh 'mvn --version'
                echo "$apigeeUsername"
                echo "Stable Revision: ${env.stable_revision}"
            }
        }
        stage('Test') {
            agent { docker { image 'maven:3-alpine' image 'node:8.12.0' } }
            environment {
                HOME = '.'
            }
            steps {
                script {
                    try {
                        sh 'npm install'
                        sh 'node --version'
                        //sh 'npm test/unit/*.js'
                    } catch (e) {
                        throw e
                    }
                }
            }
        }
        // stage('Policy-Code Analysis') {
        //     steps {
        //         sh "npm install -g apigeelint"
        //         sh "apigeelint -s wiservice_api_v1/apiproxy/ -f codeframe.js"
        //     }
        // }
        stage('Promotion') {
            steps {
                timeout(time: 2, unit: 'DAYS') {
                    input 'Do you want to Approve?'
                }
            }
        }
        stage('Deployment') {
            steps {
                sh "mvn -f wiservice_api_v1/pom.xml install -Ptest -Dusername=${apigeeUsername} -Dpassword=${apigeePassword} -Dapigee.config.options=update"
                //sh "mvn apigee-enterprise:install -Ptest -Dusername=${apigeeUsername} -Dpassword=${apigeePassword} "
            }
        }
    }
}
Basically your error message tells you everything you need to know:
ERROR: Attempted to execute a step that requires a node context while
‘agent none’ was specified. Be sure to specify your own ‘node { ... }’
blocks when using ‘agent none’.
So what is the issue here? You use agent none for your pipeline, which means you do not specify one agent for all stages. An agent is what executes a stage; if a stage has no agent, it cannot be executed, and that is your issue here.
The following two stages have no agent, which means there is no Docker container / server or whatever on which they can be executed:
stage('Promotion') {
    steps {
        timeout(time: 2, unit: 'DAYS') {
            input 'Do you want to Approve?'
        }
    }
}
stage('Deployment') {
    steps {
        sh "mvn -f wiservice_api_v1/pom.xml install -Ptest -Dusername=${apigeeUsername} -Dpassword=${apigeePassword} -Dapigee.config.options=update"
        //sh "mvn apigee-enterprise:install -Ptest -Dusername=${apigeeUsername} -Dpassword=${apigeePassword} "
    }
}
So you have to add agent { ... } to both stages separately, or use a global agent like the following and remove the agents from your stages:
pipeline {
    agent {
        docker { image 'maven:3-alpine' }
    } ...
For further information, see the guide to setting up master and agent machines, the material on distributed Jenkins builds, or the official documentation.
I think you meant to add agent any instead of agent none, because each stage requires at least one agent (either declared at the top for the pipeline or per stage).
Also, I see some more issues.
Your Test stage specifies two images for the same stage:
agent { docker { image 'maven:3-alpine' image 'node:8.12.0' } }
Your stage executes only npm commands, though, and I believe only one of the images will actually be used.
To clarify a bit more on mkemmerz's answer: your Promotion stage is designed correctly. If you plan to have an input step in the pipeline, do not add an agent at the pipeline level, because input steps block the executor context while they wait. See https://jenkins.io/blog/2018/04/09/whats-in-declarative/
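Putting both answers together, a minimal sketch of the fixed pipeline (image tags taken from the question) keeps agent none at the top so the input stage does not hold an executor, and gives every stage that runs commands its own Docker agent:
pipeline {
    agent none
    stages {
        stage('Build') {
            agent { docker { image 'maven:3-alpine' } }
            steps {
                sh 'mvn --version'
            }
        }
        stage('Test') {
            agent { docker { image 'node:8.12.0' } } // one image per stage
            steps {
                sh 'node --version'
            }
        }
        stage('Promotion') {
            // no agent here on purpose: the input step waits without blocking an executor
            steps {
                timeout(time: 2, unit: 'DAYS') {
                    input 'Do you want to Approve?'
                }
            }
        }
        stage('Deployment') {
            agent { docker { image 'maven:3-alpine' } }
            steps {
                sh 'mvn -f wiservice_api_v1/pom.xml install -Ptest'
            }
        }
    }
}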

Modify env.BRANCH_NAME variable with branch-env.BRANCH_NAME in jenkinsfile for a multibranch pipeline project

I have created a multibranch pipeline project, created a Jenkinsfile, and put it in the dev branch.
In one of the stages I have to run mvn sonar:sonar -Dsonar.scm.branch=branch-${env.BRANCH_NAME}, but it gives a bad substitution error for branch-${env.BRANCH_NAME}.
I need branch-${env.BRANCH_NAME} as the branch name so that on the SonarQube dashboard I can see branch-dev in the branches section.
If I use mvn sonar:sonar -Dsonar.scm.branch=env.BRANCH_NAME, it produces output, but the branch is treated as a short-lived branch in SonarQube, and we want it to be a long-lived branch.
#!/usr/bin/env groovy
pipeline {
    agent { label 'ol73_slave-jdk8u192-git' }
    options {
        timestamps()
        timeout(time: 2, unit: 'HOURS')
        buildDiscarder(logRotator(numToKeepStr: '10'))
        disableConcurrentBuilds()
    }
    stages {
        stage('Checkout') {
            steps {
                checkout scm
            }
        }
        stage('Unit Test and Code Scan') {
            steps {
                echo "*****JUnit Tests, JaCoCo Code Coverage, & SonarQube Code Quality Scan*****"
                withMaven(jdk: 'jdk8_u192', maven: 'maven-3.3.9', mavenSettingsConfig: '79ecf9bd-8cbc-4d5e-b7d1-200241e16b52') {
                    sh '''
                    cd DARC
                    mvn clean package sonar:sonar -Dsonar.host.url=***** -Dsonar.login=******* -Dsonar.exclusions=file:**/src/test/** -B -Pcoverage -Dsonar.branch.name=branch-${env.BRANCH_NAME}
                    '''
                }
            }
        }
    }
}
In Groovy, plain Strings are written with single quotes ('') and GStrings with double quotes ("").
Interpolation only happens in GStrings, so in your example this should simply be:
sh """
cd DARC
mvn clean package sonar:sonar -Dsonar.host.url=***** -Dsonar.login=******* -Dsonar.exclusions=file:**/src/test/** -B -Pcoverage -Dsonar.branch.name=branch-${env.BRANCH_NAME}
"""

Analyzing code with SonarQube from a Jenkins pipeline while using a SonarScanner Docker container

I want to perform SonarQube analysis of the code in a Git repository, and I want to use SonarScanner from a Docker container, not from the Jenkins configuration.
I've tried to create this pipeline:
pipeline {
    agent { docker { image 'emeraldsquad/sonar-scanner:latest' } }
    stages {
        stage('build && SonarQube analysis') {
            steps {
                withSonarQubeEnv('sonar.tools.devops.****') {
                    sh 'sonar-scanner \\ -Dsonar.projectKey=myProject \\ -Dsonar.sources=./src \\'
                }
            }
        }
        stage("Quality Gate") {
            steps {
                timeout(time: 1, unit: 'HOURS') {
                    // Parameter indicates whether to set pipeline to UNSTABLE if Quality Gate fails
                    // true = set pipeline to UNSTABLE, false = don't
                    // Requires SonarScanner for Jenkins 2.7+
                    waitForQualityGate abortPipeline: true
                }
            }
        }
    }
}
The build fails in the build && SonarQube analysis stage, with this build output:
Injecting SonarQube environment variables using the configuration: sonar.tools.devops.*****
[Pipeline] {
[Pipeline] sh
+ sonar-scanner ' -Dsonar.projectKey=myProject' ' -Dsonar.sources=./src' '\'
ERROR: Unrecognized option: -Dsonar.sources=./src
INFO:
INFO: usage: sonar-scanner [options]
INFO:
INFO: Options:
INFO: -D,--define <arg> Define property
INFO: -h,--help Display help information
INFO: -v,--version Display version information
INFO: -X,--debug Produce execution debug output
I would try removing the double backslashes between the arguments:
sh 'sonar-scanner -Dsonar.projectKey=myProject -Dsonar.sources=./src'
In a Groovy single-quoted string, \\ becomes a literal backslash, which the shell then treats as an escape for the following space. The escaped space is no longer an argument separator and ends up glued to the front of the next argument; that is why the log above shows ' -Dsonar.projectKey=myProject' with a leading space, which the scanner does not recognize as a -D option.
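If you want to keep the command split across lines for readability, end each line with a single backslash so the shell sees a normal line continuation (in a Groovy double-quoted string, \\ collapses to one backslash, and there must be no stray backslash after the last argument):
sh """
    sonar-scanner \\
        -Dsonar.projectKey=myProject \\
        -Dsonar.sources=./src
"""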

Running SonarQube Scanner on a Jenkins remote slave

I have a Docker container running Jenkins (2.150.1) and another Docker container running SonarQube (7.4). Jenkins is using the SonarQube Scanner for Jenkins plugin and the scanning is done on the Jenkins container. The Jenkinsfile for the project looks like this:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'building...'
            }
        }
        stage('Test') {
            steps {
                echo 'testing...'
                withSonarQubeEnv('SonarQube') {
                    sh '/var/jenkins_home/sonar-scanner/sonar-scanner-3.2.0.1227-linux/bin/sonar-scanner'
                }
                echo 'really finished testing2'
            }
        }
        stage("Quality Gate") {
            steps {
                timeout(time: 1, unit: 'MINUTES') {
                    waitForQualityGate abortPipeline: true
                }
            }
        }
        stage('Deployment') {
            steps {
                echo 'deploying...'
            }
        }
    }
}
To get the scanning to work as part of a Jenkins pipeline job, I manually installed sonar-scanner on the Jenkins container by downloading the zip file and unzipping it to: /var/jenkins_home/sonar-scanner/sonar-scanner-3.2.0.1227-linux
This is working well, but I want to improve it by:
taking the hardcoded path to sonar-scanner out of my Jenkinsfile
specifying a non-local location for sonar-scanner, because I now need to run the scan on another VM/container instead of on the Jenkins container
I tried using Manage Jenkins > Global Tool Configuration > SonarQube Scanner and updated my Jenkinsfile to use SONAR_RUNNER_HOME instead of the hardcoded path, but that didn't work: I got an error that sonar-scanner can't be found.
In Manage Jenkins > Global Tool Configuration > SonarQube Scanner, check "Install automatically".
Then go to Manage Jenkins > Configure System and add a SonarQube server entry under the SonarQube servers section.
The Name of that entry should be the same as the parameter in the line in your Jenkinsfile: withSonarQubeEnv('SonarQube')
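With the tool configured, the hardcoded path can be replaced by the tool step. A minimal sketch, assuming the scanner was registered under the (hypothetical) name 'SonarScanner' in Global Tool Configuration:
stage('Test') {
    steps {
        echo 'testing...'
        script {
            // resolves (and, with "Install automatically", downloads) the scanner on whichever node runs this stage
            def scannerHome = tool 'SonarScanner'
            withSonarQubeEnv('SonarQube') {
                sh "${scannerHome}/bin/sonar-scanner"
            }
        }
    }
}
Because the tool step resolves the installation on the node that actually executes the stage, this also covers running the scan on another VM or container instead of on the Jenkins container.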

Pass variables between Jenkins stages

I want to pass a variable that I read in stage A to stage B. I see in some examples that people write it to a file, but I guess that is not really a nice solution. I tried writing it to an environment variable, but I haven't been successful with that. How can I set this up properly?
To get it working I tried a lot of things, and read that I should open the shell block with """ instead of ''' and escape variables as \${foo}.
Below is what I have as a pipeline:
#!/usr/bin/env groovy
pipeline {
    agent { node { label 'php71' } }
    environment {
        packageName = 'my-package'
        packageVersion = ''
        groupId = 'vznl'
        nexus_endpoint = 'http://nexus.devtools.io'
        nexus_username = 'jenkins'
        nexus_password = 'J3nkins'
    }
    stages {
        // Package dependencies
        stage('Install dependencies') {
            steps {
                sh '''
                echo Skip composer installation
                #composer install --prefer-dist --optimize-autoloader --no-interaction
                '''
            }
        }
        // Unit tests
        stage('Unit Tests') {
            steps {
                sh '''
                echo Running PHP code coverage tests...
                #composer test
                '''
            }
        }
        // Create artifact
        stage('Package') {
            steps {
                echo 'Create package refs'
                sh """
                mkdir -p ./build/zpk
                VERSIONTAG=\$(grep 'version' composer.json)
                REGEX='"version": "([0-9]+.[0-9]+.[0-9]+)"'
                if [[ \${VERSIONTAG} =~ \${REGEX} ]]
                then
                    env.packageVersion=\${BASH_REMATCH[1]}
                    /usr/bin/zs-client packZpk --folder=. --destination=./build/zpk --name=${env.packageName}-${env.packageVersion}.zpk --version=${env.packageVersion}
                else
                    echo "No version found!"
                    exit 1
                fi
                """
            }
        }
        // Publish ZPK package to Nexus
        stage('Publish packages') {
            steps {
                echo "Publish ZPK Package"
                sh "curl -u ${env.nexus_username}:${env.nexus_password} --upload-file ./build/zpk/${env.packageName}-${env.packageVersion}.zpk ${env.nexus_endpoint}/repository/zpk-packages/${groupId}/${env.packageName}-${env.packageVersion}.zpk"
                archive includes: './build/**/*.{zpk,rpm,deb}'
            }
        }
    }
}
As you can see, the packageVersion that I read in the Package stage needs to be used in the Publish packages stage as well.
General feedback on the pipeline is of course always welcome too.
A problem in your code is that you are assigning the version to an environment variable within the sh step. This step executes in its own isolated process, which inherits the parent process's environment variables but cannot modify them.
The only ways of passing data back to the parent are STDOUT/STDERR and the exit code. Since you want a string value, it is best to echo the version from the sh step and assign it to a variable within the script context.
If you reuse the node, the script context persists and the variables will be available in the subsequent stage. A working example is below. Note that any attempt to put this inside a parallel block is prone to failure, as the version variable could be written to by multiple processes.
#!/usr/bin/env groovy
pipeline {
    environment {
        AGENT_INFO = ''
    }
    agent {
        docker {
            image 'alpine'
            reuseNode true
        }
    }
    stages {
        stage('Collect agent info') {
            steps {
                echo "Current agent info: ${env.AGENT_INFO}"
                script {
                    def agentInfo = sh(script: 'uname -a', returnStdout: true)
                    println "Agent info within script: ${agentInfo}"
                    AGENT_INFO = agentInfo.replace("\n", "")
                    env.AGENT_INFO = AGENT_INFO
                }
            }
        }
        stage("Print agent info") {
            steps {
                script {
                    echo "Collected agent info: ${AGENT_INFO}"
                    echo "Environment agent info: ${env.AGENT_INFO}"
                }
            }
        }
    }
}
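Applied to your pipeline, a minimal sketch of the same idea (assuming composer.json sits in the workspace root and declares the version on a single line; packageVersion assigned without def becomes a script-level variable, the same mechanism as AGENT_INFO above):
// Create artifact
stage('Package') {
    steps {
        script {
            // extract "version": "x.y.z" from composer.json; trim() drops the trailing newline
            packageVersion = sh(
                script: 'sed -n \'s/.*"version": "\\([0-9.]*\\)".*/\\1/p\' composer.json',
                returnStdout: true
            ).trim()
            if (!packageVersion) {
                error 'No version found!'
            }
            sh "mkdir -p ./build/zpk"
            sh "/usr/bin/zs-client packZpk --folder=. --destination=./build/zpk --name=${env.packageName}-${packageVersion}.zpk --version=${packageVersion}"
        }
    }
}
// Publish ZPK package to Nexus
stage('Publish packages') {
    steps {
        // packageVersion set in the script context above is still visible here
        sh "curl -u ${env.nexus_username}:${env.nexus_password} --upload-file ./build/zpk/${env.packageName}-${packageVersion}.zpk ${env.nexus_endpoint}/repository/zpk-packages/${groupId}/${env.packageName}-${packageVersion}.zpk"
    }
}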
Another option, which doesn't involve script and stays purely declarative, is to stash things in a little temporary environment file.
You can then use this stash (like a temporary cache that only lives for the run) even if the workload is sprayed out across parallel or distributed nodes.
Something like:
pipeline {
    agent any
    stages {
        stage('first stage') {
            steps {
                // Write out any environment variables you like to a temporary file
                sh 'echo export FOO=baz > myenv'
                // Stash it away for later use
                stash 'myenv'
            }
        }
        stage('later stage') {
            steps {
                // Unstash the temporary file and apply it
                unstash 'myenv'
                // use the unstashed vars (. is the POSIX-portable form of "source")
                sh '. ./myenv && echo $FOO'
            }
        }
    }
}
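Note that stash and unstash transfer the file through the Jenkins controller, which is exactly what lets a later stage running on a different agent pick it up. The steps are designed for small files, so for anything large an artifact repository is the better tool.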
