I'm using a private GitHub repo with a Jenkinsfile to build a project. I'd actually like to do two separate builds: one for develop that builds whenever a branch is pushed, and one for qa that builds nightly. I've set up a GitHub Organization, as this seems to be the only way to use credentials to check out the repository and perform the build.
My Jenkinsfile looks like:
node {
    stage('Preparation') {
        properties([[$class: 'ParametersDefinitionProperty',
            parameterDefinitions: [
                [$class: 'StringParameterDefinition', name: 'build_url'],
                [$class: 'StringParameterDefinition', name: 'build_url2']
            ]
        ]])
        checkout scm
    }
    stage('Build') {
        dir('Vecna_iDeliver_Torso') {
            sh 'npm install'
            sh 'node_modules/.bin/gulp build'
        }
    }
    stage('Upload') {
        sh 'aws s3 sync dist s3://app-dev'
    }
    stage('Cleanup') {
        deleteDir()
    }
}
This all works great, but I need to be able to set the environment variables (the build URLs) when running gulp, and their values will depend on the environment I want to build for. The S3 bucket I want to upload to will also depend on the environment.
When I set the properties above and then find the build job under my GitHub organization, I can see that it accepts the build parameters. However, there doesn't seem to be any way for me to set these externally; I can only use them via "Build with Parameters." That would be fine if I wanted to run the build manually each time, but I want it to run nightly. Since the two environments require different build parameter values, I can't set them as defaults.
Is there any way for me to set build parameter values ahead of time using the Jenkins pipeline?
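To illustrate, here is the kind of per-environment selection I'm after, if deriving it inside the pipeline is the right approach (a sketch only; the URLs and bucket names are placeholders):

// Sketch: derive environment-specific values from the branch being built.
// BRANCH_NAME is provided by multibranch/organization jobs.
def cfg = (env.BRANCH_NAME == 'qa') ?
    [buildUrl: 'https://qa.example.com', bucket: 's3://app-qa'] :
    [buildUrl: 'https://dev.example.com', bucket: 's3://app-dev']

withEnv(["build_url=${cfg.buildUrl}"]) {
    sh 'node_modules/.bin/gulp build'
}
sh "aws s3 sync dist ${cfg.bucket}"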
Related
My project has 3 submodules in GitLab, which are all needed to build my project. I want to create independent pipelines in Jenkins to monitor and pull when a merge request is open.
If I create individual pipelines, Jenkins will create a new folder with the name of the pipeline project like so: "jenkins_home/workspace/submodule1", "jenkins_home/workspace/submodule2", "jenkins_home/workspace/submodule3".
Is it possible to specify the directory where I want to checkout each submodule? As in, checkout all into "jenkins_home/workspace/common_folder", where common_folder will contain submodule1, submodule2 and submodule3.
P.S. I tried bat 'cd common_folder', but the cd command just hangs and never executes.
I also tried dir('subdir') {}, which just creates a new directory inside the submodule's pipeline directory: "jenkins_home/workspace/submodule1/subdir/code_from_git".
#!/usr/bin/env groovy
pipeline {
    agent { label 'master' }
    environment {
        gbuild = 'true'
        DB_ENGINE = 'sqlite'
    }
    options {
        skipDefaultCheckout()
    }
    stages {
        stage('Checkout') {
            steps {
                script {
                    checkout([
                        // HERE: need to check out into a custom folder, not the workspace
                        $class: 'GitSCM',
                        branches: scm.branches,
                        extensions: scm.extensions + [
                            [$class: 'GitLFSPull'],
                            [$class: 'CleanCheckout']
                        ],
                        userRemoteConfigs: scm.userRemoteConfigs
                    ])
                }
            }
        }
    }
}
I believe using dir, as you are doing, is the correct approach; alternatively, you can create separate pipelines.
Jenkins works in a master/slave configuration: the pipeline you create makes a folder of the same name in the workspace on the master server, and that folder is created on the slave server when you run the pipeline for the first time. Once the pipeline runs and checks out the code on the slave server, the results are then pushed back to your master server.
I hope this explains the working principle.
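For the custom checkout location itself, here is a minimal sketch of the dir-based approach, plus GitSCM's RelativeTargetDirectory extension as an alternative that redirects the checkout itself (the folder paths are placeholders):

// Run the checkout inside a specific folder; the absolute path is a placeholder.
dir('/var/jenkins_home/workspace/common_folder/submodule1') {
    checkout scm
}

// Alternative: have GitSCM place the checkout in a folder relative to the workspace.
checkout([
    $class: 'GitSCM',
    branches: scm.branches,
    extensions: scm.extensions + [
        [$class: 'RelativeTargetDirectory', relativeTargetDir: 'common_folder/submodule1']
    ],
    userRemoteConfigs: scm.userRemoteConfigs
])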
A possible workaround for projects with subprojects, where you want to track each subproject for merge requests and need all of the subprojects to build, is to use independent pipelines.
An additional comment: since I have no admin access on the server PC, which may be what limited my ability to execute some simple commands, this solution might not be right for you.
Because my cmd commands in the pipeline were not executing and kept the whole pipeline from running, and I was not able to change the location of the project from the workspace to a desired location, I created 2 extra pipelines.
The first pipeline listens for webhooks from GitLab and pulls the branch in the merge request (it also verifies whether the event is a merge request: if so, it takes the branch being merged; if not, it takes the master branch):
stage('Checkout') {
    steps {
        script {
            if (env.gitlabActionType == 'Merge') {
                checkout([
                    $class: 'GitSCM',
                    branches: [[name: "${env.gitlabSourceBranch}"]]
                ])
            } else {
                checkout([
                    $class: 'GitSCM',
                    branches: [[name: 'master']]
                ])
            }
        }
    }
}
The second pipeline copies the checked-out files into the desired location. For this step I made a freestyle project, where I execute a Windows batch command to xcopy CheckedoutDir DesiredDestination.
The second pipeline has a build trigger to run after the first pipeline is built stable. It also has a "Trigger/call builds on other projects" step to trigger the main pipeline that does the building and unit testing.
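In pipeline terms, the copy step amounts to something like this (the paths are placeholders for my actual directories; in the freestyle project it is the same command in an "Execute Windows batch command" build step):

// Copy the checked-out files to the desired destination, including subfolders.
bat 'xcopy /E /I /Y "C:\\jenkins_home\\workspace\\submodule1" "C:\\jenkins_home\\workspace\\common_folder\\submodule1"'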
I have created a Jenkins parameterized pipeline script, shown below, which I have stored in my GitHub repository.
properties([parameters([
    string(defaultValue: 'Devasish', description: 'Enter your name', name: 'Name'),
    choice(choices: ['QA', 'Dev', 'UAT', 'PROD'], description: 'Where you want to deploy?', name: 'Environnment')
])])
pipeline {
    agent any
    stages {
        stage('one') {
            steps {
                echo "Hello ${Name} Your code is Building in ${Environnment}"
            }
        }
        stage('Two') {
            steps {
                echo "Hello ${Name} hard testing in ${Environnment}"
            }
        }
        stage('Three') {
            steps {
                echo "Hello1 ${Name} deploying in ${Environnment}"
            }
        }
    }
}
Then, I created a Jenkins job, choosing the pipeline option. While creating the pipeline, under the Build Triggers section I checked the "GitHub hook trigger for GITScm polling" checkbox, and under the Pipeline section I chose "Pipeline script from SCM", selected Git as the SCM, and provided the repository URL where the above Jenkinsfile script is stored.
Then, under the GitHub repository settings, I went to Webhooks and added one whose Payload URL I set to myJenkinsServerURL/github-webhook/. This enables the behavior that whenever a push event occurs in the repository, it runs the Jenkins pipeline I created above.
Now, the situation is: when I run this Jenkins job from the classic UI by clicking "Build with Parameters", I get a text box to fill in my name and a dropdown with the list of 4 options ('QA', 'Dev', 'UAT', 'PROD') I gave in the script, to choose which server I want to deploy my code to; then it runs.
But when I commit to GitHub, the Jenkins pipeline starts without asking for parameter values; it just takes the defaults: Devasish for the name and QA for the server.
What should I do to get an option to fill in these details when the build is not started from the classic UI?
Thanks in advance.
As you have noted, when you trigger your pipeline manually, it will ask for the build parameters and let you specify values before proceeding.
However, when the pipeline is triggered automatically (e.g. by SCM triggers/webhooks), it is assumed to be an unattended build, and it will use the defaultValue settings from your Jenkinsfile's build parameter definitions.
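If you need non-default values on unattended runs, one option is to compute them in the pipeline when no user started the build. A hedged sketch: getBuildCauses() is available on recent Jenkins versions, and the class-name check and the chosen fallback value are assumptions you would adapt:

// Decide whether this run was triggered by SCM/webhook rather than a user.
def causes = currentBuild.getBuildCauses()
def unattended = causes.any { c ->
    c._class?.toString()?.contains('SCMTrigger') || c._class?.toString()?.contains('GitHubPush')
}
def targetEnv = unattended ? 'QA' : params.Environnment
echo "Deploying to ${targetEnv}"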
Background
My team wants to update several linting rules in our project, however, doing so will cause our Jenkins build pipeline which lints, tests and builds each feature branch to break. We don't want to lose the value of linting each feature branch before merging, so we agree that linting only the files that the feature branch changes is a reasonable way to introduce these new lint rules without forcing us to re-lint the whole project up-front. Given that our entire project is already linted, this seems like a reasonable move.
A while ago I wrote a git tool to do exactly this. It determines which files have changed since the feature branch diverged from master and outputs those files so they can be consumed by eslint, pycodestyle and other linters. Here's the source if you're interested in how this is done.
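If you want the same idea inline in a pipeline rather than via the tool, it boils down to roughly this (a sketch, not the tool's actual implementation; it requires origin/master to be available locally, which is exactly the fetch problem below):

// List the files changed since the feature branch diverged from master.
def changed = sh(
    returnStdout: true,
    script: 'git diff --name-only $(git merge-base origin/master HEAD)'
).trim()
echo "Changed files:\n${changed}"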
Problem
Jenkins' declarative build process and its GitHub Branch Source plugin seem to have brittle checkout behavior that can't be modified to check out more than just the feature branch it's been asked to build.
If I call git fetch origin stage within a build step, Jenkins complains about missing credentials. I don't feel comfortable sticking credentials into my pipeline file; I'd MUCH prefer to keep using the Git plugin to manage credentials for our private GitHub repo and to pull branches. However, I'm at a loss as to how to tell it to fetch more than just the feature branch.
For reference, here are the relevant portions of my Jenkinsfile.
As you can see, I've tried adding the GitSCM code block, to no avail. I've read this medium article, which solves a similar problem, but I'm not using SSH credentials and I'd prefer not to, given that we're already managing credentials using the Git plugin.
pipeline {
    agent any
    tools {
        nodejs 'node12.7.0'
    }
    stages {
        stage('checkout') {
            steps {
                checkout([
                    $class: 'GitSCM',
                    branches: [[name: '*']],
                    extensions: scm.extensions,
                    userRemoteConfigs: [],
                    doGenerateSubmoduleConfigurations: true
                ])
            }
        }
        stage('install') {
            steps {
                script {
                    sh 'git config --add remote.origin.fetch +refs/heads/master:refs/remotes/origin/master'
                    sh 'yarn install'
                }
            }
        }
        stage('lint & test') {
            failFast true
            parallel {
                stage('lint') {
                    when {
                        not {
                            anyOf {
                                branch 'stage'; branch 'int'; branch 'prod'
                            }
                        }
                    }
                    steps {
                        script {
                            sh """
                                git submodule update --init
                                yarn run lint
                            """
                        }
                    }
                }
                // ...
            }
        }
        stage('deploy') {...}
    }
    post {
        failure {
            notifySlack()
        }
    }
}
Create a credential in Jenkins from your SSH private key, and add its credentialsId to the checkout's userRemoteConfigs so it is used while checking out (the value below is just an example of a credential ID from my Jenkins environment):
userRemoteConfigs: [[credentialsId: '7969s7612-adruj-au2cd-492msa802f']]
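For context, here is a fuller sketch of how that slots into the checkout; the URL is a placeholder and the credentials ID is the example value from above:

checkout([
    $class: 'GitSCM',
    branches: [[name: '*/master']],
    userRemoteConfigs: [[
        url: 'https://github.com/your-org/your-repo.git', // placeholder
        credentialsId: '7969s7612-adruj-au2cd-492msa802f'  // example ID
    ]]
])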
One frequent root cause, also mentioned in the referenced medium article, is that Jenkins only checks out the current branch that needs to be built.
An easy option I just found to make other project branches available:
1. Configure your pipeline job.
2. Under Behaviors -> General, add "Specify ref specs".
3. Optionally adjust the parameter to the refs you need, e.g. the branches to compare against. Or you can fetch all branches by keeping the default +refs/heads/*:refs/remotes/@{remote}/*, as shown in the screenshot:
[Screenshot: Jenkins pipeline job ref spec configuration]
P.S.: This seems to be part of the Git Jenkins plugin, but I couldn't find it in the docs...
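For completeness, the same refspec can also be supplied from a Jenkinsfile if you prefer a scripted checkout; this sketch reuses the job's own remote config and assumes a single remote:

checkout([
    $class: 'GitSCM',
    branches: scm.branches,
    extensions: scm.extensions,
    userRemoteConfigs: [[
        url: scm.userRemoteConfigs[0].url,
        credentialsId: scm.userRemoteConfigs[0].credentialsId,
        // Fetch all branches so they are available for comparison.
        refspec: '+refs/heads/*:refs/remotes/origin/*'
    ]]
])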
I have a scenario in which I have a frontend repository with multiple branches.
Here's my repo vs application structure.
I have a single Jenkinsfile like below:
pipeline {
    agent any
    parameters {
        string(name: 'CUSTOMER_NAME', defaultValue: 'customer_1')
    }
    stages {
        stage('Build') {
            steps {
                sh '''
                    yarn --mutex network
                    /usr/local/bin/grunt fetch_and_deploy:$CUSTOMER_NAME -ac test
                    /usr/local/bin/grunt collect_web
                '''
            }
        }
    }
}
The above Jenkinsfile is the same for all customers, so I would like to understand the best way to have multiple customers build using the same Jenkinsfile, with different pipelines created based on the parameter $CUSTOMER_NAME.
I am not sure I understood your problem, but I guess you could use a shared pipeline library: https://jenkins.io/doc/book/pipeline/shared-libraries/
You can put the build step in the library and call it with CUSTOMER_NAME as a parameter.
(Please note: a shared pipeline library must be stored in a separate Git repository!)
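A minimal sketch of what that could look like, with made-up names: a vars/customerBuild.groovy step in the library repository, called with the parameter from each job:

// vars/customerBuild.groovy in the shared library repository (hypothetical name)
def call(String customerName) {
    sh """
        yarn --mutex network
        /usr/local/bin/grunt fetch_and_deploy:${customerName} -ac test
        /usr/local/bin/grunt collect_web
    """
}

// In each Jenkinsfile, after loading the library (library name is a placeholder):
// @Library('my-shared-lib') _
// customerBuild(params.CUSTOMER_NAME)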
I've got a Maven, Java project, and I'm using Git.
I want to use Jenkins to build + test + deploy (the .war file) on a Tomcat server (on the same device).
My current question is about triggering the build by pushing changes to the git repository's master. It did work with a Jenkins freestyle project: there I could set up my git repository so that Jenkins detected any changes and ran the build.
But as far as my research goes, a "pipeline" should be a better way to run the build + test + deploy process, so I created a pipeline and also wrote a Jenkinsfile.
pipeline {
    agent any
    stages {
        stage('Compile Stage') {
            steps {
                withMaven(maven: 'maven_3_5_1') {
                    bat 'mvn clean compile'
                }
            }
        }
        stage('Testing Stage') {
            steps {
                withMaven(maven: 'maven_3_5_1') {
                    bat 'mvn test'
                }
            }
        }
        stage('Deployment Stage (WAR)') {
            steps {
                withMaven(maven: 'maven_3_5_1') {
                    bat 'mvn deploy'
                }
            }
        }
    }
}
The current problem is that inside a pipeline project I could not find an option for setting up the git repository, and Jenkins currently does not notice any changes in git when I push.
What do I have to do so that Jenkins runs the build when changes are detected in git (like in the freestyle project)?
Thank you very much in advance.
Definition Inside the Repository (Jenkinsfile)
You should place the pipeline definition into a file called Jenkinsfile inside your repository.
This has the great advantage that your pipeline is also versioned. Using the Multibranch Project, you can point Jenkins to your Git repo and it will automatically discover all branches containing such Jenkinsfile (and create a job for each of them). You can find more information in the documentation.
In case you don't want jobs for different branches, you can also configure the job to take the pipeline definition from SCM.
With that specified, you can configure the job to poll for SCM changes regularly.
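If the pipeline definition lives in the Jenkinsfile anyway, the polling trigger can also be declared there instead of in the job configuration (a sketch; the cron spec is just an example):

pipeline {
    agent any
    triggers {
        // Poll the SCM roughly every five minutes; 'H' spreads the load.
        pollSCM('H/5 * * * *')
    }
    stages {
        stage('Build') {
            steps {
                echo 'build steps here'
            }
        }
    }
}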
Definition in the Job
In case you really don't want to put your pipeline into the repository (I don't recommend this), then you can use the git step to get your code:
pipeline {
    agent any
    stages {
        stage('Compile Stage') {
            steps {
                git 'https://git.example.com/repo.git'
                withMaven(maven: 'maven_3_5_1') {
                    bat 'mvn clean compile'
                }
            }
        }
        // ...
More options for the checkout (e.g. other branches) can be found in the git step documentation.
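For example, a specific branch can be selected like this (URL and branch name are placeholders):

git url: 'https://git.example.com/repo.git', branch: 'develop'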
Finally, change the job to be built at regular intervals.
And now comes the point where I'm struggling (while editing the post): this probably builds the project every time (every 5 minutes in the example). I am not sure whether currentBuild.changeSets contains the changes that are explicitly checked out with the git step. If it does, then you can check whether it contains changes and abort the build in such cases. All not very nice...
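If currentBuild.changeSets does turn out to contain those changes, a guard along these lines might work (a sketch under that assumption):

stage('Compile Stage') {
    when {
        // Only proceed when the checkout recorded at least one change set.
        expression { !currentBuild.changeSets.isEmpty() }
    }
    steps {
        withMaven(maven: 'maven_3_5_1') {
            bat 'mvn clean compile'
        }
    }
}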