Jenkins: how to use Jenkins variables in the sshCommand plugin?

I'm building a war file with Maven and sending the build to another server via the Jenkins SSH Publisher plugin.
stage('Deploy develop build to Test Server') {
    when {
        branch 'develop'
    }
    steps {
        // sends war file to test_server
        dir("${WORKSPACE}/ag.mycompany.feature/target/") {
            sshPublisher(
                alwaysPublishFromMaster: true,
                continueOnError: false,
                failOnError: true,
                publishers: [
                    sshPublisherDesc(
                        configName: "test_server",
                        transfers: [sshTransfer(sourceFiles: "vc-admin_${VERSION}_${BUILD_NUMBER}_dev.war", remoteDirectory: '/wars/main_vc_admin/develop')],
                        verbose: true
                    )
                ]
            )
        }
    }
}
This part works very well; I can see my war file with the version and build number.
Then I try to cp these files (on the remote server) to another folder and build a Docker image via docker-compose.
The connection to the remote server via withCredentials is OK; I tested it with some sudo commands like 'docker ps' and it works well.
But I can't pass variables like VERSION and BUILD_NUMBER to sshCommand.
Is there any way to pass these variables?
Also, all the examples I found on the internet wrap withCredentials and sshPublisher in a script { ... } block, which is why I used a script block in my declarative pipeline. I don't know how to create the remote variable using declarative pipeline alone.
stage("build docker containers") {
when {
branch 'develop'
}
steps {
script{
withCredentials([sshUserPrivateKey(credentialsId: 'test_server',
keyFileVariable: 'test_user',
passphraseVariable: '',
usernameVariable: 'test_user')]) {
// code block
def remote = [ : ]
remote.name = "MY_TEST_SERVER"
remote.host = "1.1.1.1"
remote.user = "user"
remote.password = "pass"
remote.allowAnyHosts = true
remote.identityFile = test_user
remote.pty = true
sshCommand remote: remote, command="cp /home/user/wars/main_vc_admin/develop/vc-admin_${VERSION}_${BUILD_NUMBER}_dev.war /home/user/main-vc-deployment/backend/vc-admin.war"
sshCommand remote: remote, sudo: true, command: "docker-compose --file /home/user/main-vc-deployment build && docker-compose --file /home/user/main-vc-deployment run -d"
}
}
}
}
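For reference, sshCommand takes its command as a named parameter (command:, not command=), and Jenkins variables interpolate inside double-quoted Groovy strings. A minimal sketch of the idea, assuming VERSION and BUILD_NUMBER are set in the environment at this point in the pipeline:
// build the remote command in a Groovy variable first, so the
// interpolated result can be echoed and checked before it runs
def warName = "vc-admin_${env.VERSION}_${env.BUILD_NUMBER}_dev.war"
def deployCmd = "cp /home/user/wars/main_vc_admin/develop/${warName} /home/user/main-vc-deployment/backend/vc-admin.war"
echo "Will run on remote: ${deployCmd}"
sshCommand remote: remote, command: deployCmd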

Related

Jenkins Publish Over SSH pipeline parameterized

I am using the Publish Over SSH plugin in Jenkins to deploy the jar file created by the build process. I have created a pipeline like this:
node {
    stage('Checkout') {
        git([url: '.............', branch: 'myBranch', credentialsId: 'mycredentials'])
    }
    stage('Build') {
        script {
            sh 'chmod a+x mvnw'
            sh './mvnw clean package'
        }
    }
    stage('Deploy to Server') {
        def pom = readMavenPom file: 'pom.xml'
        script {
            sshPublisher(publishers: [sshPublisherDesc(configName: 'server-instance1',
                transfers: [sshTransfer(cleanRemote: true,
                                        sourceFiles: "target/${env.PROJECT_NAME}-${pom.version}.jar",
                                        removePrefix: "target",
                                        remoteDirectory: "${env.PROJECT_NAME}",
                                        execCommand: "mv ${env.PROJECT_NAME}/${env.PROJECT_NAME}-${pom.version}.jar ${env.PROJECT_NAME}/${env.PROJECT_NAME}.jar"),
                            sshTransfer(
                                execCommand: "/etc/init.d/${env.PROJECT_NAME} restart -Dspring.profiles.active=${PROFILE}"
                            )
                ])
            ])
        }
    }
}
This works. I have an SSH server configured under Manage Jenkins >> Configure System >> Publish Over SSH.
Now I want to deploy to multiple servers. Let's say I create multiple SSH configurations named server-instance1 and server-instance2. How do I make this Jenkins job parameterized? I tried checking the checkbox and selecting a Choice Parameter, but I am not able to figure out how to make the values for this dropdown come from the SSH server list (instead of hardcoding them).
I tried a few things as mentioned here (How to Control Parametrized publishing in Jenkins using Publish over SSH plugin's Label field). Unfortunately, none of the articles talks about doing this from a pipeline.
Any help is much appreciated.
If you want to select the SSH server name dynamically, you can use the Extended Choice Parameter plugin, which allows you to execute Groovy code that creates the options for the parameter.
In the plugin you can use the following code to get the values:
import jenkins.model.*
def publish_ssh = Jenkins.instance.getDescriptor("jenkins.plugins.publish_over_ssh.BapSshPublisherPlugin")
configurations = publish_ssh.getHostConfigurations() // get all server configurations
return configurations.collect { it.name } // return the list of all servers
To configure this parameter in your pipeline you can use the following code for a scripted pipeline:
properties([
    parameters([
        extendedChoice(name: 'SERVER', type: 'PT_SINGLE_SELECT', description: 'Server for publishing', visibleItemCount: 10,
            groovyScript: '''
                import jenkins.model.*
                def publish_ssh = Jenkins.instance.getDescriptor("jenkins.plugins.publish_over_ssh.BapSshPublisherPlugin")
                return publish_ssh.getHostConfigurations().collect { it.name }
            ''')
    ])
])
Or the following code for a declarative pipeline:
pipeline {
    agent any
    parameters {
        extendedChoice(name: 'SERVER', type: 'PT_SINGLE_SELECT', description: 'Server for publishing', visibleItemCount: 10,
            groovyScript: '''
                import jenkins.model.*
                def publish_ssh = Jenkins.instance.getDescriptor("jenkins.plugins.publish_over_ssh.BapSshPublisherPlugin")
                return publish_ssh.getHostConfigurations().collect { it.name }
            ''')
    }
    ...
}
Once the parameter is defined, just use it in your sshPublisher step:
sshPublisher(publishers: [sshPublisherDesc(configName: SERVER, transfers: ...
Another option when using the Extended Choice Parameter is to configure it as Multi-Select instead of Single-Select, so a user can select multiple servers; you can then use the parallel step to publish to all selected servers in parallel, as sketched below.
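A rough sketch of that multi-select variant, assuming the parameter value arrives as a comma-separated string and reusing the transfer details (and the pom variable) from the question:
// SERVER holds e.g. "server-instance1,server-instance2" from the multi-select parameter
def branches = params.SERVER.split(',').collectEntries { server ->
    [(server): {
        sshPublisher(publishers: [sshPublisherDesc(configName: server,
            transfers: [sshTransfer(sourceFiles: "target/${env.PROJECT_NAME}-${pom.version}.jar",
                                    removePrefix: "target",
                                    remoteDirectory: "${env.PROJECT_NAME}")])])
    }]
}
parallel branches // one parallel branch per selected server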

How to transfer file from one server to another via SSH Steps Plugin

I'm new to Jenkins and I can't figure out how to transfer a file from one server to another via the SSH Steps plugin.
Can you show a pipeline example?
I would be very grateful for some help!
I have already tried:
def remote = [:]
remote.name = "Nginx"
remote.host = "1.2.3.4"
remote.allowAnyHosts = true
node {
    withCredentials([sshUserPrivateKey(credentialsId: 'Nginx_inst', keyFileVariable: 'identity', passphraseVariable: '', usernameVariable: 'myUser')]) {
        remote.user = myUser
        remote.identityFile = identity
        stage("SSH Transfer") {
            sshCommand remote: remote, command: "sudo cp /home/myUser/docs.zip /var/www/html"
            // sshPut remote: remote, from: 'docs.zip', into: '/var/www/html/'
            sshCommand remote: remote, command: "sudo unzip -tq /var/www/html/docs.zip"
        }
    }
}
But I get an error:
Executing command on ****[1.2.3.4]: sudo cp /home/****/docs.zip /var/www/html sudo: false
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
com.jcraft.jsch.JSchException: USERAUTH fail
at com.jcraft.jsch.UserAuthPublicKey.start(UserAuthPublicKey.java:119)
at com.jcraft.jsch.Session.connect(Session.java:470)
..............
Finished: FAILURE
This is related to SSH keys. Your error is caused by a problem with the keys used when SSHing from Jenkins to the target server.
Have a look at "com.jcraft.jsch.JSchException: Auth fail" with working passwords
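Once authentication works, note the sudo: false in the log above: sshCommand has its own sudo flag, separate from putting sudo in the command string. A minimal sketch of the transfer, reusing the remote map from the question and assuming the remote user has passwordless sudo:
// upload with sshPut, then move the file into place with sudo
sshPut remote: remote, from: 'docs.zip', into: '/home/myUser/'
sshCommand remote: remote, sudo: true, command: "cp /home/myUser/docs.zip /var/www/html"
sshCommand remote: remote, sudo: true, command: "unzip -tq /var/www/html/docs.zip"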

Multiple ssh remotes in one jenkins pipeline

Can I use multiple remotes in one Jenkins pipeline using the SSH Pipeline Steps plugin?
Now my pipeline looks like this:
def remote = [:]
remote.name = 'PRE-DEV'
remote.host = 'x.x.x.x'
remote.user = 'jenkins'
remote.identityFile = '/var/lib/jenkins/.ssh/id_rsa'
remote.allowAnyHosts = true
remote.agentForwarding = true
pipeline {
    agent any
    stages {
        stage('BUILD') {
            steps {
                sshCommand remote: remote, command: "build commands"
            }
        }
        stage('UNIT TESTS') {
            steps {
                sshCommand remote: remote, command: "tests commands"
            }
        }
        stage('DEPLOY TO DEV') {
            steps {
                sshCommand remote: remote, command: "scp artifacts push to other vm"
            }
        }
    }
}
Now I need an additional stage ('RUN ON DEV') where I can run my artifacts on the other VM. How can I do that in the same pipeline?
Solution one:
You can just define another remote map, like below:
def secondRemote = [:]
secondRemote.name = 'PRE-DEV'
secondRemote.host = 'your new host'
secondRemote.user = 'jenkins'
secondRemote.identityFile = '/var/lib/jenkins/.ssh/id_rsa'
secondRemote.allowAnyHosts = true
secondRemote.agentForwarding = true
Then use it with:
sshCommand remote: secondRemote, command: "your new command"
Solution two:
Store your private key in Jenkins credentials, then use the SSH Agent plugin:
https://www.jenkins.io/doc/pipeline/steps/ssh-agent/
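A minimal sketch of that second solution; the credentials ID 'jenkins-ssh-key' and the host y.y.y.y are placeholders:
stage('RUN ON DEV') {
    steps {
        sshagent(credentials: ['jenkins-ssh-key']) { // placeholder credentials ID
            // the wrapped sh step can now ssh to the other VM using the loaded key
            sh 'ssh -o StrictHostKeyChecking=no jenkins@y.y.y.y "run artifact commands"'
        }
    }
}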

How to setup sonar scanner in Jenkins Declarative Pipeline

I'm facing a problem implementing the SonarQube scanner for my repository in my Jenkinsfile. I don't know where I should add the SonarQube scanner properties in the Jenkinsfile.
I've set up Jenkins locally on my Windows system. The projects are purely based on Python, Ruby & React.
agent { label 'master' }
triggers {
    GenericTrigger([
        genericVariables: [
            [key: 'pr_from_branch', value: '$.pullrequest.source.branch.name',
             expressionType: 'JsonPath', regexpFilter: '', defaultValue: '']
        ],
        token: 'test'])
}
options {
    buildDiscarder(logRotator(numToKeepStr: '5'))
}
stages {
    stage('Initialize & SonarQube Scan') {
        steps {
            script {
                def scannerHome = tool 'sonarScanner'
                withSonarQubeEnv('My SonarQube Server') {
                    bat """
                    ${scannerHome}/bin/sonar-runner.bat
                    pip install -r requirements.txt
                    """
                }
            }
        }
    }
    stage('Quality Gate') {
        steps {
            script {
                sleep time: 3000, unit: 'MILLISECONDS'
                timeout(time: 1, unit: 'MINUTES') { // just in case something goes wrong, the pipeline will be killed after a timeout
                    def qg = waitForQualityGate() // reuse the taskId previously collected by withSonarQubeEnv
                    if (qg.status != 'OK') {
                        error "Pipeline aborted due to quality gate failure: ${qg.status}"
                    }
                }
            }
        }
    }
    stage('Smoke Test') {
        steps {
            bat "pytest -s -v tests/home/login_test.py"
            script { currentBuild.result = 'SUCCESS' }
        }
    }
}
The properties include:
-----------------Sonarqube configuration........................
sonar.projectKey=<*****>
sonar.projectName=<project name>
sonar.projectVersion=1.0
sonar.login=<sonar-login-token>
sonar.sources=src
sonar.exclusions=**/*.doc,**/*.docx,**/*.ipch,/node_modules/,
sonar.host.url=http://<url>/
-----------------Sonar for bitbucket plugin configuration...................
sonar.bitbucket.repoSlug=<project name>
sonar.bitbucket.accountName=<name>
sonar.bitbucket.oauthClientKey=<OAuth_Key>
sonar.bitbucket.oauthClientSecret=<OAuth_secret>
sonar.analysis.mode=issues
I can manually add these properties to a sonar-project.properties file and put that file in my project root directory, but then the analysis runs locally, not on the server. To avoid that, I want to add these properties to the Jenkinsfile.
We run the Sonar scanner as a Docker container, but this should give you a fair idea of how to use your properties in a Jenkinsfile.
stage("Sonar Analysis"){
sh "docker pull docker.artifactory.company.com/util-sonar-runner:latest"
withSonarQubeEnv('sonarqube'){
sh "docker run --rm -v ${workspace}:/opt/spring-service -w /opt/spring-service -e SONAR_HOST_URL=${SONAR_HOST_URL} -e SONAR_AUTH_TOKEN=${SONAR_AUTH_TOKEN} docker.artifactory.company.com/util-sonar-runner:latest /opt/sonar-scanner/bin/sonar-scanner -Dsonar.host.url=${SONAR_HOST_URL} -Dsonar.login=${SONAR_AUTH_TOKEN} -Dsonar.projectKey=spring-service -Dsonar.projectName=spring-service -Dsonar.projectBaseDir=. -Dsonar.sources=./src -Dsonar.java.binaries=./build/classes -Dsonar.junit.reportPaths=./build/test-results/test -Dsonar.jacoco.reportPaths=./build/jacoco/test.exec -Dsonar.exclusions=src/test/java/**/* -Dsonar.fortify.reportPath=fortifyResults-${IMAGE_NAME}.fpr -Dsonar.password="
}
}
Alternatively, you can run the pipeline step like this; the Sonar server properties can then be defined under a profile in the pom.xml file.
steps {
    withSonarQubeEnv('SonarQube') {
        sh 'mvn -Psonar -Dsonar.sourceEncoding=UTF-8 org.sonarsource.scanner.maven:sonar-maven-plugin:3.0.2:sonar'
    }
}
The SonarQube scanner needs to be defined in the Jenkins Global Tool Configuration section.
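Putting that together for a declarative pipeline, here is a minimal sketch; the tool name 'sonarScanner' and server name 'My SonarQube Server' come from the question, and the -D values are placeholders for the properties listed above:
pipeline {
    agent { label 'master' }
    stages {
        stage('SonarQube Scan') {
            steps {
                withSonarQubeEnv('My SonarQube Server') { // injects sonar.host.url and the auth token
                    script {
                        def scannerHome = tool 'sonarScanner'
                        bat """${scannerHome}\\bin\\sonar-scanner.bat ^
                            -Dsonar.projectKey=my-project-key ^
                            -Dsonar.projectName=my-project ^
                            -Dsonar.projectVersion=1.0 ^
                            -Dsonar.sources=src ^
                            -Dsonar.exclusions=**/*.doc,**/*.docx,**/*.ipch"""
                    }
                }
            }
        }
    }
}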

Skip Jenkins pipeline stage on local dev environment condition

I have a Jenkins pipeline script with a step that executes two linters (PyLint and Flake8), triggered by a pull request, like this:
pipeline {
    agent any
    stages {
        stage('PR Lint') {
            when { branch "PR-*" }
            steps {
                parallel(
                    flake8: {
                        sh "mkdir flake8"
                        sh "git diff -U0 | tox -r -e flake8 -- --diff - --exit-zero --tee --output-file=flake8/flake8.txt"
                        archiveArtifacts allowEmptyArchive: true, artifacts: '**/flake8/*.txt'
                        step([
                            $class: 'ViolationsToGitHubRecorder',
                            config: violation_to_github_config
                        ])
                    },
                    pylint: {
                        script {
                            if (readFile('tox.ini').contains('[testenv:pylint]')) {
                                sh "mkdir pylint"
                                sh "tox -e pylint -- --errors-only --output-format=parseable > pylint/pylint.txt || true"
                                archiveArtifacts allowEmptyArchive: true, artifacts: '**/pylint/*.txt'
                                step([
                                    $class: 'ViolationsToGitHubRecorder',
                                    config: violation_to_github_config
                                ])
                            }
                        }
                    }
                )
            }
        }
    }
}
I want to give developers the option to skip the execution of PyLint depending on some local configuration. As you can see, I currently have the line if (readFile('tox.ini').contains('[testenv:pylint]')), but a change to that config would get pushed to the repository on merge, and I don't want that.
Is there a workaround?
I'd add another file, listed in .gitignore, so it won't be checked in. Then you can simply extend your existing if to check whether that file exists, or extend your when directive, e.g.:
when {
    allOf {
        branch "PR-*"
        expression { !fileExists("local") }
    }
}
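Extending the existing if instead would look something like this; the marker file name .skip-pylint is just an example and should be listed in .gitignore:
script {
    // skip PyLint when a developer has created the git-ignored marker file
    if (readFile('tox.ini').contains('[testenv:pylint]') && !fileExists('.skip-pylint')) {
        sh "mkdir pylint"
        sh "tox -e pylint -- --errors-only --output-format=parseable > pylint/pylint.txt || true"
        // ... remaining pylint steps as above
    }
}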
