We are currently developing a centralized control system for our CI/CD projects. There are many projects with many branches, so we are using multibranch pipelines (this forces us to use the Jenkinsfile from the project branches, so we can't provide a custom Jenkinsfile as with Pipeline projects). We want to control everything from one Git repo, which should hold the Kubernetes YAMLs, Dockerfile, and Jenkinsfile for every project. When a developer presses the build button, the Jenkinsfile from their project repo is supposed to run our Jenkinsfile. Is it possible to do this?
E.g. :
pipeline {
    agent any
    stages {
        stage('Retrieve Jenkinsfile From Repo') { // RETRIEVE JENKINSFILE FROM REPO
            steps {
                git branch: "master",
                    credentialsId: 'gitlab_credentials',
                    url: "jenkinsfile_repo"
                script {
                    // RUN JENKINSFILE FROM THE REPO
                }
            }
        }
    }
}
The main reason we are doing this is that the Jenkinsfile contains sensitive content, such as production database connections, and we don't want to store the Jenkinsfile in the developers' repos. You can also suggest the correct way to achieve this other than using only one repo.
EDIT: https://plugins.jenkins.io/remote-file/
This plugin solved all my problems. I couldn't try the suggestions below.
As an option, you can use the pipeline build step to trigger another job.
pipeline {
    agent any
    stages {
        stage('build another job') {
            steps {
                build 'second_job_name_here'
            }
        }
    }
}
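If the downstream job needs context from the calling pipeline, the build step also accepts parameters and a wait flag. A minimal sketch, assuming the downstream job actually defines a string parameter named BRANCH (both names are placeholders):
build job: 'second_job_name_here',
      parameters: [string(name: 'BRANCH', value: env.BRANCH_NAME ?: 'master')],
      wait: true // block until the downstream job finishes and propagate its result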
Try the load step:
script {
    // rename Jenkinsfile to .groovy
    sh 'mv Jenkinsfile Jenkinsfile.groovy'
    // RUN JENKINSFILE FROM THE REPO
    load 'Jenkinsfile.groovy'
}
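Put together with the checkout from the question, a minimal scripted sketch could look like the following. The repository URL and credentials ID are placeholders; note that load evaluates the file as a Groovy script, so this assumes the central file is written so it can be executed that way:
node {
    // fetch the central repo that holds the shared Jenkinsfile
    git branch: 'master',
        credentialsId: 'gitlab_credentials',
        url: 'https://gitlab.example.com/devops/pipelines.git'
    // rename and evaluate it; the loaded script runs immediately
    sh 'mv Jenkinsfile Jenkinsfile.groovy'
    load 'Jenkinsfile.groovy'
}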
Related
I am currently a Jenkins user and am exploring TeamCity.
In Jenkins we have the concept of shared libraries, which basically extract generic Groovy code so it can be reused across different Jenkins pipelines. This avoids re-writing the same functionality in each Jenkinsfile, following DRY (don't repeat yourself), hides implementation complexity, and keeps pipelines short and easier to understand.
Example:
There could be a repository having all the Groovy functions like:
Repo: http://github.com/DEVOPS/Utilities.git (repo Utilities)
Sample Groovy script ==>> GitUtils.groovy with the functions below:
public void setGitConfig(String userName, String email) {
    sh "git config --global user.name ${userName}"
    sh "git config --global user.email ${email}"
}

public void gitPush(String branchName) {
    sh "git push origin ${branchName}"
}
In the Jenkinsfile we can just call these functions as below (of course we need to configure Jenkins so it knows the repo URL of the shared library, and give the library a name):
Pipeline
//name of shared library given in jenkins
@Library('utilities') _
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                // log.info 'Starting'
                script {
                    def gitUtils = new GitUtils()
                    gitUtils.setGitConfig("Ray", "Ray@rayban.com")
                }
            }
        }
    }
}
And that's it: anyone wanting the same functionality just has to include the library in their Jenkinsfile and use it in the pipeline.
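For completeness: class-based helpers like GitUtils live under src/ in the library and may need an import in the Jenkinsfile. An alternative sketch, assuming a hypothetical vars/gitUtils.groovy in the same library, exposes the functions as a global variable so no instantiation is needed:
// vars/gitUtils.groovy (hypothetical file in the shared library)
def setGitConfig(String userName, String email) {
    sh "git config --global user.name ${userName}"
    sh "git config --global user.email ${email}"
}

def gitPush(String branchName) {
    sh "git push origin ${branchName}"
}
In the pipeline it can then be called directly as gitUtils.setGitConfig("Ray", "Ray@rayban.com") inside a script block.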
Questions:
Can we migrate the same approach to TeamCity, and if yes, how can it be done? We do not want to spend a lot of time re-writing.
Jenkins also supports stashing and unstashing of the workspace between stages; is a similar concept present in TeamCity?
Example:
pipeline {
    agent any
    stages {
        stage('Git checkout') {
            steps {
                stash includes: '/root/hello-world/*', name: 'mysrc'
            }
        }
        stage('maven build') {
            agent { label 'slave-1' }
            steps {
                unstash 'mysrc'
                sh label: '', script: 'mvn clean package'
            }
        }
    }
}
As for reusing common TeamCity Kotlin DSL libraries, this can be done via Maven dependencies: you have to mention the library in the pom.xml file within your DSL code. You can also consider using JitPack if your DSL library code is hosted on GitHub, for example, and you do not want to handle building it separately and publishing its Maven artifacts.
Although, when migrating from Jenkins to TeamCity, you will most likely have to rewrite the common library (if you still need one at all), as the TeamCity project model and DSL are quite different from what you have in Jenkins.
Speaking of stashing/unstashing workspaces, this may be covered either by artifact rules and artifact dependencies (as described here: https://www.jetbrains.com/help/teamcity/artifact-dependencies.html) or by repository clone mirroring on agents.
I am using Jenkins 2.89.2 and my project has a Jenkinsfile which defines the whole build pipeline and its steps. I have searched online for how to trigger a build on a push to the GitHub repo, and all the results mention an option Build when a change was pushed to GitHub on the Jenkins configuration page. But on my Jenkins configuration page I couldn't find this option; it only has the option Periodically if not otherwise run. I have installed the GitHub-related plugins but still couldn't find that option. Is there any other configuration I can change?
Below is my Jenkinsfile:
pipeline {
    agent {
        label 'master'
    }
    tools {
        maven 'maven-3.5.2'
        jdk 'jdk9'
    }
    stages {
        stage('Checkout SCM') {
            steps {
                echo 'Checkout from Git...'
                checkout scm
            }
        }
        stage('Build') {
            steps {
                echo 'Building '
            }
        }
    }
}
After some searching I think I found the solution. The trigger option can be defined in the Jenkinsfile as below:
pipelineTriggers([
    [$class: "GitHubPushTrigger"]
])
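For reference, a minimal sketch of how this trigger is usually expressed in a Declarative Pipeline via the triggers directive (in a scripted pipeline, the snippet above would instead be wrapped in a properties() call):
pipeline {
    agent any
    triggers {
        // fires a build when the GitHub webhook reports a push
        githubPush()
    }
    stages {
        stage('Build') {
            steps {
                echo 'Building...'
            }
        }
    }
}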
You basically have to create a hook from Jenkins to GitHub, which is located under the Build Triggers option.
This option should be available to you since you have the Git Plugin installed.
P.S. The solution you have mentioned will work only for a Declarative Pipeline and will not work if you plan to switch to a scripted pipeline.
With Jenkins, using the Declarative Pipeline syntax, how do I get the Dockerfile (Dockerfile.ci in this example) from the SCM (Git), since the agent block is executed before all the stages?
pipeline {
    agent {
        dockerfile {
            filename 'Dockerfile.ci'
        }
    }
    stages {
        stage('Checkout') {
            steps {
                git(
                    url: 'https://www.github.com/...',
                    credentialsId: 'CREDENTIALS',
                    branch: "develop"
                )
            }
        }
        [...]
    }
}
In all the examples I've seen, the Dockerfile seems to be already present in the workspace.
You could try to declare an agent for each stage separately: for the checkout stage you could use some default agent, and the Docker agent for the others.
pipeline {
    agent none
    stages {
        stage('Checkout') {
            agent any
            steps {
                git(
                    url: 'https://www.github.com/...',
                    credentialsId: 'CREDENTIALS',
                    branch: "develop"
                )
            }
        }
        stage('Build') {
            agent {
                dockerfile {
                    filename 'Dockerfile.ci'
                }
            }
            steps {
                [...]
            }
        }
        [...]
    }
}
If you're using a multi-branch pipeline it automatically checks out your SCM before evaluating the agent. So in that case you can specify the agent from a file in the SCM.
The answer is in the Jenkins documentation on the Dockerfile parameter:
In order to use this option, the Jenkinsfile must be loaded from
either a Multibranch Pipeline or a Pipeline from SCM.
Just scroll down to the Dockerfile section, and it's documented there.
The obvious problem with this approach is that it impairs pipeline development: instead of testing code in a pipeline field on the server, it must be committed to the source repository for each testable change. Note also that the Jenkinsfile checkout cannot be sparse or lightweight, as that will only pick up the script -- and not any accompanying Dockerfile to be built.
I can think of a couple ways to work around this.
Develop against agents in nodes with the reuseNode true directive (a sketch follows below). Then, when the code is stable, the separate agent blocks can be combined together at the top of the Jenkinsfile, which must then be loaded from the SCM.
Develop using the dir() solution that specifies the exact workspace directory, or alternatively use one of the other examples in this solution.
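A minimal sketch of the first workaround, assuming the Dockerfile.ci from the question and a placeholder build command: the top-level agent performs the implicit checkout, and the per-stage Docker agent reuses that node and workspace, so the Dockerfile is available when the image is built:
pipeline {
    agent any // implicit checkout scm happens here, bringing in Dockerfile.ci
    stages {
        stage('Build in Docker') {
            agent {
                dockerfile {
                    filename 'Dockerfile.ci'
                    reuseNode true // reuse the outer node and workspace instead of a fresh one
                }
            }
            steps {
                sh 'make build' // placeholder build command
            }
        }
    }
}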
I have a Multibranch Pipeline project which configures Jenkins jobs based on a Jenkinsfile per branch. The source code is hosted on a GitHub Enterprise server.
When I view the configuration of a branch which was created from the Jenkinsfile, I noticed that there is an option GitHub project. This option allows defining the URL of the corresponding GitHub project.
I want to define this property via my Jenkinsfile in Pipeline syntax, but I don't know which command to use and how.
Relevant parts of my Jenkinsfile:
pipeline {
    agent {
        docker {
            image 'plinzen/android:latest'
            label 'android'
        }
    }
    triggers {
        githubPush()
    }
    stages {
        stage('build') {
            steps {
                checkout scm
                sh './gradlew clean assembleDebug'
            }
        }
    }
}
How can I define the GitHub project properties via my Jenkinsfile? I use the Jenkins GitHub Plugin in my project.
You can add a new agent node and use a code snippet like the one below. Hope this helps.
git(
    url: 'git@github.com:<repo_name>.git',
    credentialsId: 'xpc',
    branch: "${branch}" // double quotes so the branch variable is interpolated
)
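The snippet above only checks out the repository; if the goal is specifically to set the GitHub project URL shown in the job configuration, the GitHub plugin exposes it as a job property. A hedged sketch, assuming the plugin's GithubProjectProperty class and a placeholder URL:
// scripted step (can also run inside a script block) that sets the
// "GitHub project" option on the job; the URL is a placeholder
properties([
    [$class: 'GithubProjectProperty',
     projectUrlStr: 'https://github.com/your-org/your-repo/']
])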
I've got a Maven, Java project and I'm using Git.
I want to use Jenkins for build + test + deploy (.war file) on a Tomcat server (on the same device).
My current question is about triggering the build by pushing changes to the Git repository's master. This did work with a Jenkins freestyle project: there I could set up my Git repository so Jenkins detected any changes and ran the build.
But as far as my research goes, using a pipeline should be better for running the build + test + deploy process. So I created a pipeline and also wrote a Jenkinsfile.
pipeline {
    agent any
    stages {
        stage('Compile Stage') {
            steps {
                withMaven(maven: 'maven_3_5_1') {
                    bat 'mvn clean compile'
                }
            }
        }
        stage('Testing Stage') {
            steps {
                withMaven(maven: 'maven_3_5_1') {
                    bat 'mvn test'
                }
            }
        }
        stage('Deployment Stage (WAR)') {
            steps {
                withMaven(maven: 'maven_3_5_1') {
                    bat 'mvn deploy'
                }
            }
        }
    }
}
The current problem is that inside a pipeline project I could not find an option for setting up the Git repository. Currently Jenkins does not track any changes in Git when I push a change.
What do I have to do so that Jenkins runs a build when changes are detected in Git (like in the freestyle project)?
I thank you very much in advance.
Definition Inside the Repository (Jenkinsfile)
You should place the pipeline definition into a file called Jenkinsfile inside your repository.
This has the great advantage that your pipeline is also versioned. Using a Multibranch Pipeline project, you can point Jenkins to your Git repo and it will automatically discover all branches containing such a Jenkinsfile (and create a job for each of them). You can find more information in the documentation.
In case you don't want jobs for different branches, you can also configure the job to take the pipeline definition from SCM. With that specified, you can configure the job to poll for SCM changes regularly.
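Alternatively, the polling can be declared in the Jenkinsfile itself via the triggers directive; a minimal sketch (the cron expression is an example value):
pipeline {
    agent any
    triggers {
        // poll the SCM roughly every five minutes; the H placeholder lets
        // Jenkins spread the exact minute to avoid load spikes
        pollSCM('H/5 * * * *')
    }
    stages {
        stage('Build') {
            steps {
                echo 'Building...'
            }
        }
    }
}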
Definition in the Job
In case you really don't want to put your pipeline into the repository (I don't recommend this), then you can use the checkout step to get your code:
pipeline {
    agent any
    stages {
        stage('Compile Stage') {
            steps {
                checkout([$class: 'GitSCM',
                          userRemoteConfigs: [[url: 'https://git.example.com/repo.git']]])
                withMaven(maven: 'maven_3_5_1') {
                    bat 'mvn clean compile'
                }
            }
        }
        // ...
More options for the checkout (e.g. other branches) can be found in the step documentation.
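For example, a hedged sketch that checks out a specific branch with credentials (URL, branch, and credentials ID are placeholders):
checkout([$class: 'GitSCM',
          branches: [[name: '*/develop']],
          userRemoteConfigs: [[url: 'https://git.example.com/repo.git',
                               credentialsId: 'repo-credentials']]])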
Finally, change the job to be built at regular intervals.
And now comes the point where I'm struggling (while editing the post): this probably builds the project every time (every 5 minutes in the example). I am not sure whether currentBuild.changeSets contains the changes that are explicitly checked out with checkout. If it does, you can check whether it contains changes and abort the build if it doesn't. All not very nice...
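If currentBuild.changeSets does get populated, a hedged sketch of that check, placed right after the checkout step, could look like this:
script {
    // assumption: changeSets reflects the explicit checkout above
    if (currentBuild.changeSets.isEmpty()) {
        currentBuild.result = 'ABORTED'
        error('No SCM changes detected; skipping this scheduled build.')
    }
}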