There is a job of type Pipeline on Jenkins.
The Gradle plugin is installed.
I am trying to run Gradle tasks directly from the pipeline script, but without success.
Current pipeline script:
node {
    stage('Compile') {
        // First variant
        gradle {
            tasks: 'clean'
            tasks: 'compileJava'
        }
        // Second variant
        gradle tasks: 'clean'
        gradle tasks: 'compileJava'
        // Third variant
        gradle('clean')
        gradle('compileJava')
    }
}
This script doesn't fail, but it simply does nothing.
Output to the console:
[Pipeline] stage
[Pipeline] { (Compile)
[Pipeline] }
[Pipeline] // stage
How can I run Gradle directly from the pipeline script?
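For reference, a minimal sketch of two common ways to invoke Gradle from a scripted pipeline: through the Gradle wrapper committed to the repository, or through a Gradle installation from Global Tool Configuration resolved with the tool step (the installation name 'gradle6' is an assumption):

node {
    stage('Compile') {
        // Variant A: Gradle wrapper from the repository (assumes the workspace is already checked out)
        sh './gradlew clean compileJava'

        // Variant B: a Gradle installation configured under
        // Manage Jenkins > Global Tool Configuration ('gradle6' is a hypothetical name)
        def gradleHome = tool 'gradle6'
        sh "${gradleHome}/bin/gradle clean compileJava"
    }
}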
In Jenkins > Global Tool Configuration > JDK installations I have added JDK 7 and named it oracle-7u80; similarly, under Maven installations, I have added a Maven 3.5 installation and named it mvn.
Now I am using these two installations in the pipeline script:
pipeline {
    agent any
    tools {
        maven 'mvn'
        jdk 'oracle-7u80'
    }
    stages {
        stage('Example') {
            steps {
            }
        }
    }
}
I do not want to hard-code the JDK and Maven values in the tools section of the pipeline. I want to pass these values via environment variables or properties so that I can manage them externally.
Is there a way to pass the values (mvn or oracle-7u80) assigned to maven and jdk in the tools block via environment variables?
For example, when I need to inject a value within a steps/script section of a Jenkins pipeline, I can define it globally in the environment variables, or via the Jenkins project configuration:
Configure
General
Check Prepare an environment for the run
Check Keep Jenkins environment variables
Then I can provide the environment variable in the properties content or with a Properties File definition.
My intention is to get a format like this:
pipeline {
    agent any
    tools {
        maven '${MVN_VERSION}'
        jdk '${ORACLE_VERSION}'
    }
    stages {
        stage('Example') {
            steps {
            }
        }
    }
}
Pipeline projects are often used with a Jenkinsfile (Pipeline script from SCM in the Pipeline → Definition drop-down list) to bind a source code version and its build configuration to each other for reproducible builds.
Injecting build tool versions from outside before the build contradicts this idea.
I'm also not sure whether this is even conceptually possible, since values coming from outside via (environment) variables are set in stages ... script, which is a totally different declaration branch than tools. But hey, it's called declarative pipeline, not imperative, so order shouldn't matter ... in theory. I'll give it a try.
For passing external values into internal variables in general, see Pipeline: Nodes and Processes, sh: Shell Script, and also the answer to the question How to access Shell variable value into Groovy pipeline script.
Maven version injection try
pipeline {
    agent any
    tools {
        maven "${MVN_VERSION}"
    }
    stages {
        stage('Try: Maven version injected') {
            steps {
                script {
                    env.MVN_VERSION = sh script: 'echo "Maven 3.8.1"', returnStdout: true
                }
                echo "${MVN_VERSION}"
            }
        }
    }
}
As expected:
[Pipeline] stage
[Pipeline] { (Declarative: Tool Install)
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
groovy.lang.MissingPropertyException: No such property: MVN_VERSION for class: groovy.lang.Binding
...
Another idea that came to my mind is to make this project parameterized with two parameters (e.g. MVN_GLOBAL_TOOL_NAME, JDK_GLOBAL_TOOL_NAME), via Choice parameters for instance, and this works:
pipeline {
    agent any
    tools {
        maven "${MVN_GLOBAL_TOOL_NAME}" // coming from the parameterized project's build parameter
    }
    stages {
        stage('Maven tool as build parameter') {
            steps {
                echo "MVN_GLOBAL_TOOL_NAME=${MVN_GLOBAL_TOOL_NAME}"
            }
        }
    }
}
Console output:
[Pipeline] stage
[Pipeline] { (Declarative: Tool Install)
[Pipeline] tool
[Pipeline] envVarsForTool
[Pipeline] }
[Pipeline] // stage
[Pipeline] withEnv
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Maven version as build parameter)
[Pipeline] tool
[Pipeline] envVarsForTool
[Pipeline] withEnv
[Pipeline] {
[Pipeline] script
[Pipeline] {
[Pipeline] echo
MVN_GLOBAL_TOOL_NAME=Maven 3.8.1
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
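The two parameters above were created in the project configuration as Choice parameters. They could alternatively be declared in the Jenkinsfile itself via the parameters directive; a sketch under that assumption (the choice values are placeholders and must match the tool names from Global Tool Configuration):

pipeline {
    agent any
    parameters {
        // Placeholder choices -- must match a Maven installation name in Global Tool Configuration
        choice(name: 'MVN_GLOBAL_TOOL_NAME', choices: ['Maven 3.8.1'], description: 'Maven installation to use')
    }
    tools {
        maven "${params.MVN_GLOBAL_TOOL_NAME}"
    }
    stages {
        stage('Maven tool as build parameter') {
            steps {
                sh 'mvn --version'
            }
        }
    }
}

Note that a parameter declared this way typically only shows up on the Build with Parameters page after the pipeline has run once.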
See also ${JENKINS_URL}/job/${JOB_NAME}/api/:
Perform a build
If the build has parameters, post to this URL [Link note: ${JENKINS_URL}/job/${JOB_NAME}/buildWithParameters] and provide the parameters as form data.
See also: ${JENKINS_URL}/env-vars.html/.
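Besides posting to the buildWithParameters endpoint, the parameterized job can also be triggered from another pipeline's stage with the build step; a sketch (the job name 'my-tools-job' is an assumption):

// Trigger the parameterized job and pass the tool names as parameters
build job: 'my-tools-job', parameters: [
    string(name: 'MVN_GLOBAL_TOOL_NAME', value: 'Maven 3.8.1'),
    string(name: 'JDK_GLOBAL_TOOL_NAME', value: 'oracle-7u80')
]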
I want to use a Jenkins pipeline and SonarQube to analyze code. Both Jenkins and SonarQube are running in Docker. I followed the manual at https://docs.sonarqube.org/latest/analysis/scan/sonarscanner-for-jenkins to create a pipeline job:
node {
    stage('SCM') {
        checkout(XXXX)
    }
    stage('Build + SonarQube analysis') {
        def sqScannerMsBuildHome = tool 'msbuild'
        withSonarQubeEnv('sonar') {
            sh "mono ${sqScannerMsBuildHome}/SonarScanner.MSBuild.exe begin /k:myKey"
            sh 'mono MSBuild.exe /t:Rebuild'
            sh "mono ${sqScannerMsBuildHome}/SonarScanner.MSBuild.exe end"
        }
    }
}
When I built this pipeline, I got an error like this:
Cannot open assembly 'MSBuild.exe': No such file or directory.
Here is the build log:
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Build + SonarQube analysis)
[Pipeline] tool
[Pipeline] withSonarQubeEnv
Injecting SonarQube environment variables using the configuration: sonar
[Pipeline] {
[Pipeline] sh
+ mono /var/jenkins_home/tools/hudson.plugins.sonar.MsBuildSQRunnerInstallation/msbuild/SonarScanner.MSBuild.exe begin /k:myKey
SonarScanner for MSBuild 5.0.4
Using the .NET Framework version of the Scanner for MSBuild
Pre-processing started.
Preparing working directories...
07:36:24.082 Updating build integration targets...
07:36:24.993 Fetching analysis configuration settings...
07:36:25.387 Provisioning analyzer assemblies for cs...
07:36:25.388 Installing required Roslyn analyzers...
07:36:25.592 Provisioning analyzer assemblies for vbnet...
07:36:25.592 Installing required Roslyn analyzers...
07:36:25.655 Pre-processing succeeded.
[Pipeline] sh
+ mono MSBuild.exe /t:Rebuild
Cannot open assembly 'MSBuild.exe': No such file or directory.
[Pipeline] }
WARN: Unable to locate 'report-task.txt' in the workspace. Did the SonarScanner succeeded?
[Pipeline] // withSonarQubeEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 2
Finished: FAILURE
I checked the /var/jenkins_home/tools/hudson.plugins.sonar.MsBuildSQRunnerInstallation/msbuild directory and cannot find an MSBuild.exe file.
(Screenshot of the files in that directory: https://i.stack.imgur.com/2UcdK.png)
How can I solve this problem, or how can I use Jenkins and SonarQube to analyze a C# project?
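One likely cause: MSBuild.exe is not part of the SonarScanner for MSBuild installation that the tool step provides, so mono MSBuild.exe points at a file that does not exist in the workspace. A sketch of the 'Build + SonarQube analysis' stage (replacing the same stage in the node block above) that calls the build tool actually installed on the agent instead, assuming Mono's msbuild command is available on the PATH inside the Jenkins container:

stage('Build + SonarQube analysis') {
    def sqScannerMsBuildHome = tool 'msbuild'
    withSonarQubeEnv('sonar') {
        sh "mono ${sqScannerMsBuildHome}/SonarScanner.MSBuild.exe begin /k:myKey"
        // The scanner only wraps the build; the compiler itself must be installed separately.
        // 'msbuild' here is Mono's MSBuild front end -- an assumption about the agent image.
        sh 'msbuild /t:Rebuild'
        sh "mono ${sqScannerMsBuildHome}/SonarScanner.MSBuild.exe end"
    }
}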
I have an sbt project in Jenkins, created as a Pipeline project. I have installed sbt in Jenkins, checked the auto-install checkbox, and selected version 1.2.8.
Here is my Jenkinsfile:
pipeline {
    agent any
    stages {
        stage('Reload') {
            steps {
                echo "Reloading..."
                //sh "sbt reload"
                sh "${tool name: 'sbt1.2.8', type: 'org.jvnet.hudson.plugins.SbtPluginBuilder$SbtInstallation'}/bin/sbt compile"
            }
        }
    }
}
Here are the sbt settings in Jenkins:
Here are the Jenkins console logs:
+ /var/lib/jenkins/tools/org.jvnet.hudson.plugins.SbtPluginBuilder_SbtInstallation/sbt1.2.8/bin/sbt compile
[0m[[0m[0minfo[0m] [0m[0mLoading settings for project interpret-backend-project-jenkinsfile-build from plugins.sbt ...[0m
[0m[[0m[0minfo[0m] [0m[0mLoading project definition from /var/lib/jenkins/workspace/interpret-backend-project-jenkinsfile/project[0m
[0m[[0m[0minfo[0m] [0m[0mLoading settings for project interpret-backend-project-jenkinsfile from build.sbt ...[0m
[0m[[0m[0minfo[0m] [0m[0mSet current project to interpret (in build file:/var/lib/jenkins/workspace/interpret-backend-project-jenkinsfile/)[0m
[0m[[0m[0minfo[0m] [0m[0mExecuting in batch mode. For better performance use sbt's shell[0m
[0m[[0m[32msuccess[0m] [0m[0mTotal time: 4 s, completed Nov 25, 2020, 4:53:05 PM[0m
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
[Checks API] No suitable checks publisher found.
Finished: SUCCESS
Why is [0m[[0m[0minfo[0m] being displayed, and how can I fix this?
These are ANSI color codes. You can either embrace them to have colorful logs or disable them.
To enable colors in Jenkins you can modify your pipeline definition:
pipeline {
    ...
    options {
        ansiColor('xterm')
    }
    ...
}
This is a reference to the AnsiColor plugin.
If you can't use this plugin and want to disable colors in the sbt logs, you can do that via the sbt.color option: for example, by launching sbt with -Dsbt.color=false (I see that you can add this in the UI) or by adding it to the SBT_OPTS environment variable:
pipeline {
    ...
    environment {
        SBT_OPTS = "${SBT_OPTS} -Dsbt.color=false"
    }
    ...
}
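Alternatively, the flag can be passed directly on the sbt invocation from the question's Jenkinsfile; a sketch reusing the sbt1.2.8 tool name from above:

pipeline {
    agent any
    stages {
        stage('Reload') {
            steps {
                // Disable colored output for this invocation only
                sh "${tool name: 'sbt1.2.8', type: 'org.jvnet.hudson.plugins.SbtPluginBuilder$SbtInstallation'}/bin/sbt -Dsbt.color=false compile"
            }
        }
    }
}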
Check out the sbt docs and take a look at the sbt.ci option as well; it should be set automatically on Jenkins.
I have a Jenkins pipeline that periodically pulls from GitLab, builds different repos into a multi-component platform, then runs and tests it. Now I have installed a SonarQube server on the same machine (Ubuntu 18.04) and I want to connect my Jenkins to SonarQube.
In Jenkins:
I set up the SonarQube scanner under Global Tool Configuration as below:
I generated a token in SonarQube, and in the Jenkins configuration I set up the server as below, BUT I couldn't find any place to insert the token (and I think this is the problem):
In the Jenkins pipeline, this is how I added a stage for SonarQube:
stage('SonarQube analysis') {
    steps {
        script {
            scannerHome = tool 'SonarQube'
        }
        withSonarQubeEnv('SonarQube') {
            sh "${scannerHome}/bin/sonar-scanner"
        }
    }
}
But this fails with the logs below and ERROR: script returned exit code 127:
[Pipeline] { (SonarQube analysis)
[Pipeline] script
[Pipeline] {
[Pipeline] tool
Invalid tool ID
[Pipeline] }
[Pipeline] // script
[Pipeline] withSonarQubeEnv
Injecting SonarQube environment variables using the configuration: SonarQube
[Pipeline] {
[Pipeline] sh
+ /var/lib/jenkins/tools/hudson.plugins.sonar.SonarRunnerInstallation/SonarQube/bin/sonar-scanner
/var/lib/jenkins/workspace/wws-full-test#tmp/durable-2c68acd1/script.sh: 1: /var/lib/jenkins/workspace/wws-full-test#tmp/durable-2c68acd1/script.sh: /var/lib/jenkins/tools/hudson.plugins.sonar.SonarRunnerInstallation/SonarQube/bin/sonar-scanner: not found
[Pipeline] }
WARN: Unable to locate 'report-task.txt' in the workspace. Did the SonarScanner succeeded?
[Pipeline] // withSonarQubeEnv
[Pipeline] }
[Pipeline] // stage
And when I check my Jenkins tools on disk, the Sonar scanner installation is not there:
$ ls /var/lib/jenkins/tools/
jenkins.plugins.nodejs.tools.NodeJSInstallation
Can someone please let me know how I can connect Jenkins to SonarQube?
Create a token and add it to Jenkins to be able to connect to SonarQube.
You also have to create a project in SonarQube and pass its key as a parameter:
sh """
${scannerHome}/bin/sonar-scanner \
-Dsonar.projectKey=your_project_key_created_in_sonarqube_as_project \
-Dsonar.sources=. \
"""
I am trying to build a job with a pipeline on my other slave node, from the master.
The pipeline looks like this:
pipeline {
    agent {
        label "virtual"
    }
    stages {
        stage("test one") {
            steps {
                echo " test test test"
            }
        }
        stage("test two") {
            steps {
                echo " testttttttttt "
            }
        }
    }
}
The syntax doesn't produce an error, but the job doesn't seem to build on my slave server.
However, when I run a freestyle job with Restrict where this project can be run set to that label and an Execute shell step running echo "test test", it is built on my slave server.
What is wrong with my pipeline? Am I missing something?
After the build:
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] Start of Pipeline
[Pipeline] node
Running on virtual in /home/virtual/jenkins/workspace/demoo
[Pipeline] {
[Pipeline] stage
[Pipeline] { (test one)
[Pipeline] echo
test test test
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (test two)
[Pipeline] echo
testttttttttt
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Add the path you want in the Remote root directory field (highlighted in yellow) as shown below:
The build already works the way you set it up; the steps are executed on the slave. If you add something to your steps that needs the workspace, such as cloning a repository, the workspace directory will be created.
Pipeline and freestyle jobs behave differently here: a freestyle job creates the workspace directory as soon as it runs for the first time, whereas a pipeline job only creates the directory once it actually needs it.
I created a simple Pipeline:
pipeline {
    agent {
        label "linux"
    }
    stages {
        stage("test one") {
            steps {
                sh "echo 'test test test' > text.txt"
            }
        }
    }
}
I converted your echo into a sh command because my slave is a Linux slave. The sh step creates a text.txt file. As soon as I run this job, the directory is created:
[<user>#<server> test-pipeline]$ pwd
/var/lib/jenkins/workspace/test-pipeline
[<user>#<server> test-pipeline]$ ls -l
total 4
-rw-r----- 1 <user> <group> 15 Oct 7 16:49 text.txt