How to use multiple tools in Jenkins Pipeline

I need to use the nodejs and terraform tools in my build stages. The declarative pipeline I used is:
pipeline {
    agent any
    tools { nodejs "node12.14.1" terraform "terraform-v0.12.19" }
    ...
Only the nodejs tool is available; terraform is not installed and the build fails with a "command not found" error.

You need to specify each tool on a new line instead:
pipeline {
    agent any
    tools {
        nodejs "node12.14.1"
        terraform "terraform-v0.12.19"
    }
    ...
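For reference, a minimal complete sketch using both tools might look like the following. The tool names ("node12.14.1", "terraform-v0.12.19") must match the names configured under Manage Jenkins > Global Tool Configuration, and the stage contents are just placeholders:
pipeline {
    agent any
    tools {
        nodejs "node12.14.1"
        terraform "terraform-v0.12.19"
    }
    stages {
        stage('Build') {
            steps {
                // Both tools are prepended to PATH by the tools block
                sh 'node --version'
                sh 'terraform version'
            }
        }
    }
}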

Related

Jenkins avoid tool installation if it is installed already

Not a Jenkins expert here. I have a scripted pipeline with a tool installed (Node). Unfortunately it was configured to pull in other dependencies, which now takes about 250 seconds overall. I'd like to add a condition that skips this installation if Node (with its packages) was already installed previously, but I don't know where to start. Perhaps Jenkins stores metadata from previous runs that can be checked?
node {
    env.NODEJS_HOME = "${tool 'Node v8.11.3'}"
    env.PATH = "${env.NODEJS_HOME}/bin:${env.PATH}"
    env.PATH = "/opt/xs/bin:${env.PATH}"
    // ...
}
Are you using dynamic Jenkins agents (Docker containers)? In that case the tools will be installed every time you run a build.
Mount volumes into the containers, use persistent agents, or build your own Docker image with Node.js already installed (see the sketch below).
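For example, a minimal sketch of the container approach: the image already contains Node.js, and a host directory is mounted as an npm cache so it survives between builds. The image tag and host path here are assumptions you would adapt to your setup:
pipeline {
    agent {
        docker {
            image 'node:8.11.3'                   // assumed image that already contains Node.js
            args '-v /var/cache/npm:/root/.npm'   // assumed host path reused across builds as npm cache
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'node --version'
            }
        }
    }
}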
It also looks like you are using a workaround to install the nodejs tool. Jenkins supports this natively in declarative style:
pipeline {
    agent any
    tools {
        nodejs 'NodeJS_14.5'
    }
    stages {
        stage('nodejs test') {
            steps {
                sh 'npm -v'
            }
        }
    }
}
On the first run the tool will be installed. On subsequent runs it will not be reinstalled, since it is already present.

Jenkins differences between tools and docker agent

Sorry, it might be a simple question, but what are the differences between using tools and a docker agent? I think using a docker agent is much more flexible than using tools. When should I use a docker agent and when should I use tools?
Tools
pipeline {
    agent any
    tools {
        maven 'Maven 3.3.9'
        jdk 'jdk8'
    }
    stages {
        stage('Initialize') {
            steps {
                sh '''
                    echo "PATH = ${PATH}"
                    echo "M2_HOME = ${M2_HOME}"
                '''
            }
        }
        stage('Build') {
            steps {
                sh 'mvn -Dmaven.test.failure.ignore=true install'
            }
        }
    }
}
Docker Agent
pipeline {
    agent none
    stages {
        stage('Back-end') {
            agent {
                docker { image 'maven:3-alpine' }
            }
            steps {
                sh 'mvn --version'
            }
        }
    }
}
These two options serve slightly different purposes. The tools block allows you to add specific versions of maven, jdk, or gradle to your PATH. You can't use an arbitrary version; you can only use versions that are configured on the Global Tool Configuration page in Jenkins.
If your Jenkins configuration contains only a single Maven version, e.g., Maven 3.6.3, you can use only this version. Specifying a version that is not configured in the Global Tool Configuration will cause your pipeline to fail.
pipeline {
    agent any
    tools {
        maven 'Maven 3.6.3'
    }
    stages {
        stage('Example') {
            steps {
                sh 'mvn --version'
            }
        }
    }
}
Using the tools block to specify different versions of supported tools is a good option if your Jenkins server cannot run Docker containers.
The docker agent, on the other hand, gives you total freedom when it comes to specifying tools and their versions. It does not limit you to maven, jdk, and gradle, and it does not require any pre-configuration on your Jenkins server. The only tool you need is Docker, and you are free to use any tool you need in your Jenkins pipeline.
pipeline {
    agent {
        docker {
            image "maven:3.6.3-jdk-11-slim"
        }
    }
    stages {
        stage('Example') {
            steps {
                sh 'mvn --version'
            }
        }
    }
}
When to use one over the other?
There is no single right answer; it depends on the context. The tools block is very limiting, but it gives you control over which tools are used on your Jenkins instance. In some cases, people decide not to use Docker in their Jenkins environment because they prefer to control which tools are available to their users. We can agree with this or not. With the docker agent, you get access to any tool that can be shipped as a Docker container.
In some cases this is the best choice for using a tool with a specific version - your operating system may not allow you to install the desired version. Of course, keep in mind that this power and flexibility comes at a cost. You lose control over which tools are used in your Jenkins pipelines, and if you pull tons of different Docker images you will increase disk space consumption. Not to mention that the docker agent allows you to run the pipeline with tools that may consume lots of CPU and memory. (I have seen Jenkins pipelines starting Elasticsearch, Logstash, Zookeeper, and other services on nodes that were not prepared for that load.)
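If resource usage is a concern, one option is to pass resource limits to the container through the docker agent's args parameter, which is appended to the docker run command. A minimal sketch; the image name and limit values are arbitrary examples:
pipeline {
    agent {
        docker {
            image 'maven:3.6.3-jdk-11-slim'
            // Limit what this build's container may consume on the agent host
            args '--memory=2g --cpus=2'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn --version'
            }
        }
    }
}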

No tool named SonarQube Scanner 2.8 found error

I followed these instructions to download the SonarQube Scanner plugin for Jenkins. I've configured the Jenkins global settings for the SonarQube scanner correctly. The SonarQube server is set up and functioning properly.
https://docs.sonarqube.org/display/SCAN/Analyzing+with+SonarQube+Scanner+for+Jenkins#AnalyzingwithSonarQubeScannerforJenkins-AnalyzinginaJenkinspipeline
But when the build runs, it produces this error: No tool named SonarQube Scanner 2.8 found.
I am using a Jenkins declarative pipeline script for pipeline build.
I am using Jenkins ver. 2.131. I am using "SonarQube Scanner for Jenkins" Plugin version 2.8.1. I believe the Jenkins server is a common linux flavor. I am NOT using any version of Maven, and don't require it to build my projects.
I figured the plugin installed the actual scanner files for me on Jenkins. Do I need to have some version of the scanner command-line files installed, beyond what the plugin provided? Meaning, is there something other than the plugin that I need to install on my Jenkins server? I would hope the SonarQube plugin gives me everything I need to run it in a Jenkins build.
Here's the relevant part of my script:
stages {
    stage("SonarQube Analysis") {
        agent any
        steps {
            script {
                def scannerHome = tool 'SonarQube Scanner 2.8'
                withSonarQubeEnv("foo") {
                    sh "${scannerHome}/bin/sonar-scanner"
                }
            }
        }
    }
}
I think you didn't add the Scanner in the Jenkins Global Tool Configuration. You can do that with the following steps (see the sketch after these steps):
click Manage Jenkins
choose Global Tool Configuration
scroll to SonarQube Scanner
add an installation named SonarQube Scanner 2.8
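For illustration, a minimal declarative sketch assuming a scanner installation named exactly 'SonarQube Scanner 2.8' and a SonarQube server entry named 'foo'; both names must match your Jenkins configuration:
pipeline {
    agent any
    stages {
        stage('SonarQube Analysis') {
            steps {
                script {
                    // The string passed to 'tool' must match the installation name
                    // configured in Global Tool Configuration exactly
                    def scannerHome = tool 'SonarQube Scanner 2.8'
                    withSonarQubeEnv('foo') {
                        sh "${scannerHome}/bin/sonar-scanner"
                    }
                }
            }
        }
    }
}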
This may also be relevant for Maven users:
stage("SonarQube analysis") {
steps {
script {
def scannerHome = tool 'SonarQube Scanner';
withSonarQubeEnv('SonarQube Server') {
sh 'mvn clean package sonar:sonar'
}
}
}
}

Gradle tool in Jenkins Declarative Pipeline

I defined a Jenkins declarative pipeline for CI/CD of my project. I am using Gradle as my build tool. However, I don't want to use the Gradle Wrapper and check it into the VCS, so I planned on using the Jenkins tools functionality as below, which would also let me update the version number in the future if needed. But it doesn't seem to work.
pipeline {
    agent any
    tools {
        gradle "gradle-4.0"
    }
    stages {
        stage("Compile") {
            steps {
                sh 'gradle project/build.gradle classes'
            }
        }
    }
}
I get the error "script.sh: gradle: not found".
I tried to echo PATH and it does not contain the path of the auto-installed Gradle tool. Please help.
It looks like there is an issue with the Gradle plugin for Jenkins in plugin version 1.26. Please see the link to the reported bug below.
https://issues.jenkins-ci.org/browse/JENKINS-42381
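Until the plugin issue is resolved, one possible workaround is to resolve the tool explicitly and put it on PATH yourself. This is only a sketch, assuming a Gradle installation named 'gradle-4.0' is configured in Global Tool Configuration:
pipeline {
    agent any
    stages {
        stage('Compile') {
            steps {
                script {
                    // Resolve the configured tool and prepend its bin directory to PATH
                    def gradleHome = tool 'gradle-4.0'
                    withEnv(["PATH+GRADLE=${gradleHome}/bin"]) {
                        sh 'gradle --version'
                    }
                }
            }
        }
    }
}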

Jenkins 2 pipeline deploying to udeploy

I am creating a CI/CD pipeline. I am trying to create a groovy function in order to deploy a build to udeploy.
I know I will need to pass parameters into the function, such as:
udeployServer,
component,
artifactDirectory,
version,
deployApplication,
environment and
deployProcess.
I was wondering whether anyone has tried to implement this, or has any idea how I should approach it?
Thanks
I don't know anything about udeploy servers, but I do know there is no pipeline plugin for udeploy, which means that you will not have a function such as:
udeploy: server=yourserver component=yourcomponent artifactDirectory=...
However, Jenkins allows you to use shell commands inside your Groovy pipeline, so you should be able to do pretty much everything you need. So I guess the real question is: how do you usually deploy a build to udeploy? Do you do it via a REST API, do you push a file via FTP, ...?
The Jenkins build part will be pretty straightforward; have a look at how to check out and build using a Jenkins pipeline.
An example pipeline could look like:
node {
    stage 'Build'
    def mvnHome = tool 'M3'
    sh "${mvnHome}/bin/mvn clean install"
    //... Some other stages as needed...
    stage 'Deploy'
    sh "execute sh deploy script here..."
}
... where your deploy stage could use other plugins to copy files to your server, run REST API requests, etc. While writing a pipeline, have a look at the Pipeline Syntax link for a Snippet Generator that gives more detailed information about the available steps.
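If the deployment happens over a REST API, one possible shape is a small Groovy helper that shells out to curl. This is only a sketch: the server URL, endpoint path, payload fields, and authentication (omitted here) are hypothetical placeholders to replace with whatever your udeploy installation actually exposes.
def deployToUdeploy(udeployServer, component, artifactDirectory, version,
                    deployApplication, environment, deployProcess) {
    // artifactDirectory would typically feed an earlier artifact-upload step;
    // this sketch only triggers the deploy process over a hypothetical REST endpoint
    sh """
        curl -k -X PUT -H 'Content-Type: application/json' \\
             -d '{"application": "${deployApplication}", "applicationProcess": "${deployProcess}", "environment": "${environment}", "versions": [{"version": "${version}", "component": "${component}"}]}' \\
             '${udeployServer}/cli/applicationProcessRequest/request'
    """
}

node {
    stage 'Deploy'
    deployToUdeploy('https://udeploy.example.com', 'my-component', 'build/libs',
                    '1.0.0', 'my-app', 'QA', 'Deploy my-app')
}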
