I'm currently working on a basic deployment pipeline in Jenkins (using Pipeline). I am looking for the best way to do the following:
When the developer pushes to the development branch, all stages but deploy are executed.
When the developer pushes to the master branch, all stages including deploy are executed.
I have read about branch-matching patterns you can use, but I'm not sure whether that is the right approach, as the information I found was dated.
My Jenkins pipeline file:
node {
    stage('Preparation') {
        git 'git@bitbucket.org:foo/bar.git'
    }
    stage('Build') {
        sh 'mkdir -p app/cache app/logs web/media/cache web/uploads'
        sh 'composer install'
    }
    stage('Test') {
        sh 'codecept run'
    }
    stage('Deploy') {
        sh 'mage deploy to:prod'
    }
}
There's no magic here. This is just Groovy code. In a multibranch pipeline, the branch being built is available as env.BRANCH_NAME; otherwise you can pass the branch in as a job parameter. Inside the stage block, add an if check that compares the branch name with whatever logic you need, and either execute the body or not, depending on which branch is in scope.
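For example, assuming a multibranch job (where Jenkins populates env.BRANCH_NAME automatically), a sketch of the pipeline above with a guarded deploy stage could look like this:
node {
    stage('Preparation') {
        git 'git@bitbucket.org:foo/bar.git'
    }
    stage('Build') {
        sh 'mkdir -p app/cache app/logs web/media/cache web/uploads'
        sh 'composer install'
    }
    stage('Test') {
        sh 'codecept run'
    }
    stage('Deploy') {
        // only run the deploy on master; other branches just skip the body
        if (env.BRANCH_NAME == 'master') {
            sh 'mage deploy to:prod'
        } else {
            echo "Skipping deploy for branch ${env.BRANCH_NAME}"
        }
    }
}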
Related
I have created an item using Pipeline and then, in the pipeline configuration, selected "Pipeline script".
This allows me to run the build in stages, as below:
[code]
pipeline {
    agent any
    tools {
        terraform 'terraform-11'
    }
    stages {
        stage('Git Checkout terraform') {
            steps {
                git credentialsId: '********', url: 'https://******/********.git'
            }
        }
        stage('Terraform Init') {
            steps {
                sh 'terraform init'
            }
        }
        stage('Terraform A') {
            steps {
                dir('dev') {
                    sh 'terraform plan -var-file="terraform.tfvars"'
                    sh 'terraform apply -auto-approve'
                }
            }
        }
        stage('Terraform B') {
            steps {
                dir('env') {
                    sh 'terraform plan -var-file="terraform.tfvars"'
                    sh 'terraform apply -auto-approve'
                }
            }
        }
    }
}
[/code]
This works very well; I take the code out and run a series of stages (there are more stages than this). What I would like to do is have the Jenkins build run every time the Terraform scripts are updated. I have looked at examples, but none of them are part of a Pipeline / Pipeline script setup.
There is the Freestyle project, but it does not allow me to build all the stages I need.
There is Pipeline / Pipeline script from SCM, which again does not allow me to build all the stages I need.
What I want to do is stick with my current pipeline, but set it up so it runs when scripts are pushed to Bitbucket. All I need is a pointer to the right documentation, if this is possible. If it's not possible, then I will go back to the drawing board.
I worked out the solution. I set up an item that is a folder and configured the Git repo on it. Then I created a Jenkins file called Jenkinsfile with all the stages and steps, which is uploaded to the repo being built. The build runs the main item, which then pulls in the Jenkinsfile and runs it.
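For reference, a minimal sketch of such a Jenkinsfile with a polling trigger added, so that pushes to Bitbucket get picked up (the schedule is illustrative; a Bitbucket webhook would avoid polling entirely):
pipeline {
    agent any
    triggers {
        // check the repository for new commits roughly every five minutes
        pollSCM('H/5 * * * *')
    }
    stages {
        stage('Terraform Init') {
            steps {
                sh 'terraform init'
            }
        }
        // ... the remaining Terraform stages from above ...
    }
}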
All my scripts are in one branch of the repo, and I have multiple Jenkins pipeline jobs:
1. Smoke
2. Regression
3. Epic-wise execution
Each has a different pipeline script. So is it possible to have multiple Jenkins files with custom names?
node('Slave-Machine-1') {
    env.NODE_HOME = "${tool '8.9.4'}"
    env.PATH = "${env.NODE_HOME}/bin:${env.PATH}"
    def AUTO = ''
    stage('Install Dependency') {
        sshagent(['agent-id']) {
            sh 'npm install'
            sh 'npm run webdriver-install'
        }
    }
    stage('smoke') {
        sh 'npm run smoke-test'
    }
}
This is my sample pipeline script; similarly, I have multiple pipeline scripts.
You can name your pipeline scripts random_joe or anything you like as long as:
You do not use multibranch or organization pipeline projects, which specifically look for the filename Jenkinsfile to automatically create new jobs
You do not mind your text editor not syntax highlighting the pipeline scripts until you add the extension .groovy to them
It is advisable to follow conventions wherever practicable, though.
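If the jobs are created with the Job DSL plugin (an assumption; the job names, repo URL, and script paths below are illustrative), each job can point at its own script via scriptPath:
// Job DSL sketch: three pipeline jobs, each reading a differently named
// pipeline script from the same branch of the same repository
['smoke', 'regression', 'epic'].each { jobName ->
    pipelineJob(jobName) {
        definition {
            cpsScm {
                scm {
                    git {
                        remote { url('https://bitbucket.org/foo/bar.git') }
                        branch('*/master')
                    }
                }
                scriptPath("pipelines/${jobName}.groovy")
            }
        }
    }
}
The same effect can be had manually: in each job's "Pipeline script from SCM" configuration, set "Script Path" to the custom file name.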
I'm trying to get the following features to work in Jenkins' Declarative Pipeline syntax:
Conditional execution of certain stages only on the master branch
input to ask for user confirmation to deploy to a staging environment
While waiting for confirmation, it doesn't block an executor
Here's what I've ended up with:
pipeline {
    agent none
    stages {
        stage('1. Compile') {
            agent any
            steps {
                echo 'compile'
            }
        }
        stage('2. Build & push Docker image') {
            agent any
            when {
                branch 'master'
            }
            steps {
                echo "build & push docker image"
            }
        }
        stage('3. Deploy to stage') {
            when {
                branch 'master'
            }
            input {
                message "Deploy to stage?"
                ok "Deploy"
            }
            agent any
            steps {
                echo 'Deploy to stage'
            }
        }
    }
}
The problem is that stage 2 needs the output from stage 1, but this is not available when it runs. If I replace the various agent directives with a global agent any, then the output is available, but the executor is blocked while waiting for user input at stage 3. And if I try to combine stages 1 and 2 into a single stage, then I lose the ability to conditionally run some steps only on master.
Is there any way to achieve all the behaviour I'm looking for?
You need to use the stash command at the end of your first stage and then unstash the files when you need them.
I think these are available in the Snippet Generator.
As per the documentation:
Saves a set of files for use later in the same build, generally on another node/workspace. Stashed files are not otherwise available and are generally discarded at the end of the build. Note that the stash and unstash steps are designed for use with small files. For large data transfers, use the External Workspace Manager plugin, or use an external repository manager such as Nexus or Artifactory.
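A sketch of how the first two stages from the question could share files this way (the stash name and include pattern are illustrative):
stage('1. Compile') {
    agent any
    steps {
        echo 'compile'
        // keep the build output for later stages, which may run on another node
        stash name: 'build-output', includes: 'build/**'
    }
}
stage('2. Build & push Docker image') {
    agent any
    when {
        branch 'master'
    }
    steps {
        // restore the files stashed in stage 1
        unstash 'build-output'
        echo "build & push docker image"
    }
}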
I've got a Maven Java project and I'm using Git.
I want to use Jenkins for build + test + deploy (.war file) on a Tomcat server (on the same device).
My current question is about triggering the build by pushing changes to the Git repository's master branch. This did work with a Jenkins freestyle project: there I could set up my Git repository so that Jenkins detected any changes and ran the build.
But as far as my research goes, using a pipeline should be the better way to run the build + test + deploy process. So I created a pipeline and also wrote a Jenkinsfile.
pipeline {
    agent any
    stages {
        stage('Compile Stage') {
            steps {
                withMaven(maven: 'maven_3_5_1') {
                    bat 'mvn clean compile'
                }
            }
        }
        stage('Testing Stage') {
            steps {
                withMaven(maven: 'maven_3_5_1') {
                    bat 'mvn test'
                }
            }
        }
        stage('Deployment Stage (WAR)') {
            steps {
                withMaven(maven: 'maven_3_5_1') {
                    bat 'mvn deploy'
                }
            }
        }
    }
}
The current problem is that inside a pipeline project I could not find an option for setting up the Git repository, so Jenkins currently does not notice when I push a change.
What do I have to do so that Jenkins runs the build when changes are detected in Git (like in the freestyle project)?
I thank you very much in advance.
Definition Inside the Repository (Jenkinsfile)
You should place the pipeline definition into a file called Jenkinsfile inside your repository.
This has the great advantage that your pipeline is also versioned. Using a Multibranch Project, you can point Jenkins to your Git repo, and it will automatically discover all branches containing such a Jenkinsfile (and create a job for each of them). You can find more information in the documentation.
In case you don't want jobs for different branches, you can also configure the job to take the pipeline definition from SCM ("Pipeline script from SCM" in the job's Pipeline section).
With that specified, you can configure the job to poll for SCM changes regularly ("Poll SCM" under Build Triggers, with a cron-style schedule).
Definition in the Job
In case you really don't want to put your pipeline into the repository (I don't recommend this), then you can use the checkout step to get your code:
pipeline {
    agent any
    stages {
        stage('Compile Stage') {
            steps {
                git 'https://git.example.com/repo.git'
                withMaven(maven: 'maven_3_5_1') {
                    bat 'mvn clean compile'
                }
            }
        }
        // ...
    }
}
More options for the checkout (e.g. other branches) can be found in the git step documentation.
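For instance, a specific branch can be checked out like this (the branch name is illustrative):
git url: 'https://git.example.com/repo.git', branch: 'develop'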
Finally, change the job to be built at regular intervals ("Build periodically" under Build Triggers).
And now comes the point where I'm struggling (while editing the post): this probably builds the project every time (e.g. every 5 minutes), whether or not anything changed. I am not sure whether currentBuild.changeSets contains the changes that are explicitly checked out with the git step. If it does, you can check whether it contains any changes and abort the build otherwise. All not very nice...
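As a sketch of that idea (assuming currentBuild.changeSets is in fact populated for such checkouts, which I have not verified):
stage('Check for changes') {
    steps {
        script {
            // abort the scheduled build early when no new commits were found
            if (currentBuild.changeSets.isEmpty()) {
                currentBuild.result = 'NOT_BUILT'
                error('No SCM changes since the last build.')
            }
        }
    }
}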
I have an external tool that should be called as a build step in one of my Jenkins jobs. Unfortunately, this tool has some issues with quoting commands, which causes problems when the path it is called from contains whitespace.
Jenkins is installed in C:\Program Files (x86)\Jenkins, hence I'm having trouble with Jenkins calling the external tool.
What I tried is to set "Workspace Root Directory" in Jenkins -> Configuration to C:\jenkins_workspace in order to avoid any whitespace. This works for freestyle projects, but my multibranch pipeline project is still checked out and built under C:\Program Files (x86)\Jenkins\workspace.
One solution would be to move the whole Jenkins installation to e.g. C:\jenkins, which I would like to avoid. Is there a proper way to tell Pipeline jobs to use the "Workspace Root Directory" as well?
Thanks for any help
The ws step sets the workspace for the commands inside it. It is used like this:
ws('C:\\jenkins') {
    echo 'awesome commands here instead of echo'
}
You can also call a script to build the customWorkspace to use:
// if the current branch is master, this helpfully sets your workspace to /tmp/ma
// (the sed call strips "ster" from the branch name; trim() drops the trailing newline)
def partOfBranch = sh(returnStdout: true, script: 'echo $BRANCH_NAME | sed -e "s/ster//g"').trim()
def path = "/tmp/${partOfBranch}"
sh "mkdir -p ${path}"
ws(path) {
    sh "pwd"
}
You can also set it globally by using the agent block (generally at the top of the pipeline block), by applying it to a node at that level:
pipeline {
    agent {
        node {
            label 'my-defined-label'
            customWorkspace '/some/other/path'
        }
    }
    stages {
        stage('Example Build') {
            steps {
                sh 'mvn -B clean verify'
            }
        }
    }
}
Another node instruction later on might override it. Search for customWorkspace at https://jenkins.io/doc/book/pipeline/syntax/. You can also use it with the docker and dockerfile instructions.
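For example, a sketch with a docker agent (the image name is illustrative):
pipeline {
    agent {
        docker {
            image 'maven:3-jdk-8'
            customWorkspace '/some/other/path'
        }
    }
    stages {
        stage('Example Build') {
            steps {
                sh 'mvn -B clean verify'
            }
        }
    }
}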
Try this syntax instead:
pipeline {
    agent {
        node {
            label 'EB_TEST_SEL'
            customWorkspace "/home/jenkins/workspace/ReleaseBuild/${BUILD_NUMBER}/"
        }
    }
}