I have a Jenkins pipeline job which bootstraps itself. The idea here is to self-test changes, including changes to the Jenkins pipeline Groovy scripts, which are stored in git.
I was wondering how to do the bootstrapping in a separate workspace from the build. For example, if I have a job called "build_linux", Jenkins currently creates a "workspace/build_linux" folder, and everything both bootstraps and builds in there.
The issue is that my build_repository.git is very large, and I don't want to check it out twice, once at the bootstrap stage and once at the build stage. However, if I don't check both repositories out in both stages, git complains that there's code in the workspace which shouldn't be there, and either fails or tries to clean it up (which is inefficient). (In actuality I'm syncing a manifest containing both git repositories with Google's 'repo' tool.)
Ideally I'd just have separate "workspace/bootstrap/" and "workspace/build_linux/" job folders.
For example, my pipeline looks like this:
bootstrap.gvy:
--> **entry point from my jenkins config**
stage('Bootstrap') {
    node("$BOOTSTRAP_NODE") {
        git url: 'bootstrap_repository.git'
        git url: 'build_repository.git' // I don't want to do this here
        pipeline = load "build/build.gvy"
    }
    pipeline.main(env.JOB_NAME) // which in this case is "build_linux"
}
build/build.gvy:
int main(String jobName) {
    switch (jobName) {
        case "build_linux":
            doBuildLinux()
            break
    }
}

def doBuildLinux() {
    stage("Build") {
        node("$BUILD_NODE") {
            git url: 'bootstrap_repository.git' // I don't want to do this here
            git url: 'build_repository.git'
        }
    }
}
Any suggestions for how to accomplish this? Perhaps I should use a different workspace for bootstrapping?
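For example, I imagine pinning the bootstrap stage to its own folder with the ws step, something like this (just a sketch of the idea, with placeholder repository URLs; I haven't tested it):

stage('Bootstrap') {
    node("$BOOTSTRAP_NODE") {
        // allocate a dedicated workspace so the bootstrap checkout
        // stays out of the build job's folder
        ws("${env.JOB_NAME}-bootstrap") {
            git url: 'bootstrap_repository.git'
            pipeline = load 'build/build.gvy'
        }
    }
}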
Thanks
I'm very new to Jenkins. I have a monorepo where I need to check what needs to be built. I created a pipeline item in Jenkins, named it Test, and copy-pasted a script to check which files changed:
def changedFiles = currentBuild.changeSets
    .collect { it.getItems() }
    .flatten() // Ensures that we look through each commit, not just the first.
    .collect { it.getAffectedPaths() }
    .flatten()
    .toSet() // Ensures uniqueness.
echo("Changed files: ${changedFiles}")

def changedDirectories = changedFiles.collect {
    if (it.contains('folder/in/monorepo/project/foo')) { // Code cut for simplicity
        return "Foo"
    }
    return ''
}
.unique()
echo("Changed directories: ${changedDirectories}")
if (changedDirectories.contains("Foo")) {
    stage('Foo Build') {
        node {
            env.NODEJS_HOME = "${tool 'NodeJS'}"
            env.PATH = "${env.NODEJS_HOME}/bin:${env.PATH}"
            dir("folder/in/monorepo/buildscripts/") {
                sh './mybuildscript.sh'
            }
        }
    }
} else {
    echo("No Foo Build")
}
The code for loading NodeJS is documented in the NodeJS plugin.
I installed Jenkins, tried some things, and then installed the NodeJS plugin and updates for other plugins.
Now, with the updated plugins, there is some trouble, which basically relates to this: Why does Jenkins create a subfolder within the workspace@script folder to check out git code instead of using workspace@script itself?
Instead of cloning into the workspace folder, the git repo gets cloned into a folder with a unique id.
This means that the folder from my command dir("folder/in/monorepo/buildscripts/") is not found, or more precisely: it's created empty and is not the one from the git repo, because the command is executed relative to my empty workspace instead of relative to the cloned repo.
My questions are:
Is there a much easier way to achieve what I'm trying to do? AFAIK I need to implement the search for the folder as in the referenced SO question.
Doesn't this security fix from the referenced SO question break all builds / Jenkinsfiles? I mean: eventually in a pipeline someone wants to call a build script, a build tool, or a simple make with the Makefile in the repo.
EDIT: I just realised that the only reason the repo is cloned at all is that I use the option "Pipeline script from SCM" with "Lightweight checkout" unchecked. I assume the intended way is to use a "Lightweight checkout" to fetch just the Jenkinsfile and then clone the repo as part of the script. A "Pipeline script" written in the GUI would also never get a clone of the repository.
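In that case the start of my script would presumably look like this (a sketch; checkout scm re-uses the SCM configuration of the job, and the path is the one from my example above):

node {
    // with "Lightweight checkout" only the Jenkinsfile was fetched,
    // so clone the repository into the regular workspace first
    checkout scm
    dir('folder/in/monorepo/buildscripts/') {
        sh './mybuildscript.sh'
    }
}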
I am currently a Jenkins user and am exploring TeamCity.
In Jenkins we have the concept of shared libraries, which extract generic Groovy code so it can be reused across different Jenkins pipelines. This avoids re-writing the same functionality in each Jenkinsfile (DRY: don't repeat yourself), hides implementation complexity, and keeps pipelines short and easier to understand.
Example:
There could be a repository having all the Groovy functions like:
Repo: https://github.com/DEVOPS/Utilities.git (repo Utilities)
Sample Groovy script ==>> GitUtils.groovy with the functions below
public void setGitConfig(String userName, String email) {
    sh "git config --global user.name ${userName}"
    sh "git config --global user.email ${email}"
}

public void gitPush(String branchName) {
    sh "git push origin ${branchName}"
}
In a Jenkinsfile we can then just call these functions like below (of course we need to configure the shared library in Jenkins so it knows the repo URL, and give it a name):
Pipeline
//name of shared library given in jenkins
@Library('utilities') _
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                // log.info 'Starting'
                script {
                    def gitUtils = new GitUtils()
                    gitUtils.setGitConfig("Ray", "Ray@rayban.com")
                }
            }
        }
    }
}
And that's it: anyone wanting the same function just has to include the library in their Jenkinsfile and use it in the pipeline.
Questions:
Can we migrate the same over to TeamCity, and if yes, how can it be done? We do not want to spend a lot of time re-writing.
Jenkins also supports stashing and unstashing of workspace files between stages; is a similar concept present in TeamCity?
Example:
pipeline {
    agent any
    stages {
        stage('Git checkout') {
            steps {
                stash includes: '/root/hello-world/*', name: 'mysrc'
            }
        }
        stage('maven build') {
            agent { label 'slave-1' }
            steps {
                unstash 'mysrc'
                sh label: '', script: 'mvn clean package'
            }
        }
    }
}
As for reusing common TeamCity Kotlin DSL libraries, this can be done via maven dependencies. For that you have to mention it in the pom.xml file within your DSL code. You can also consider using JitPack if your DSL library code is hosted on GitHub for example and you do not want to handle building it separately and publishing its maven artifacts.
Although with a migration from Jenkins to TeamCity you will most likely have to rewrite the common library (if you still need one at all), as the TeamCity project model and DSL are quite different from what you have in Jenkins.
Speaking of stashing/unstashing workspaces, it may be covered by either artifact rules and artifact dependencies (as described here: https://www.jetbrains.com/help/teamcity/artifact-dependencies.html) or repository clone mirroring on agents.
I have one git repository with multiple folders, and I want the Jenkins pipeline to trigger when a specific folder gets changed.
How can I achieve this?
You can configure your Jenkinsfile to do that.
Take a look at the Jenkins built-in conditions. You need "changeset" here.
e.g.
stages {
    stage('yourStage') {
        when { changeset "FOLDERNAME/*" }
        steps {
            //.... what to do...
        }
    }
}
Jenkins Built-in Conditions
I've got a Maven Java project and I'm using git.
I want to use Jenkins to build + test + deploy (a .war file) on a Tomcat server (on the same device).
My current question is about triggering the build by pushing changes to the git repository's master branch. This did work with a Jenkins freestyle project: there I could set up my git repository so that Jenkins detected any changes and ran the build.
But as far as I could research, using a "pipeline" should be better for running the build + test + deploy process. So I created a pipeline and also wrote a Jenkinsfile.
pipeline {
    agent any
    stages {
        stage('Compile Stage') {
            steps {
                withMaven(maven: 'maven_3_5_1') {
                    bat 'mvn clean compile'
                }
            }
        }
        stage('Testing Stage') {
            steps {
                withMaven(maven: 'maven_3_5_1') {
                    bat 'mvn test'
                }
            }
        }
        stage('Deployment Stage (WAR)') {
            steps {
                withMaven(maven: 'maven_3_5_1') {
                    bat 'mvn deploy'
                }
            }
        }
    }
}
The current problem is that inside a pipeline project I could not find an option for setting up the git repository. Currently Jenkins does not track any changes in git when I push a change.
What do I have to do so that Jenkins runs the build when changes are detected in git (like in the freestyle project)?
I thank you very much in advance.
Definition Inside the Repository (Jenkinsfile)
You should place the pipeline definition into a file called Jenkinsfile inside your repository.
This has the great advantage that your pipeline is also versioned. Using the Multibranch Project, you can point Jenkins to your Git repo and it will automatically discover all branches containing such Jenkinsfile (and create a job for each of them). You can find more information in the documentation.
In case you don't want jobs for different branches, you can also configure the job to take the pipeline definition from SCM.
With that specified, you can configure the job to poll SCM changes regularly.
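Alternatively, the polling trigger can also live in the Jenkinsfile itself via the triggers directive (a sketch; the schedule and the Maven call are just examples matching the question):

pipeline {
    agent any
    triggers {
        // poll the repository roughly every five minutes
        pollSCM('H/5 * * * *')
    }
    stages {
        stage('Compile Stage') {
            steps {
                withMaven(maven: 'maven_3_5_1') {
                    bat 'mvn clean compile'
                }
            }
        }
    }
}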
Definition in the Job
In case you really don't want to put your pipeline into the repository (I don't recommend this), then you can use the checkout step to get your code:
pipeline {
    agent any
    stages {
        stage('Compile Stage') {
            steps {
                // simple form of the checkout; the git step clones the given URL
                git url: 'https://git.example.com/repo.git'
                withMaven(maven: 'maven_3_5_1') {
                    bat 'mvn clean compile'
                }
            }
        }
        // ...
    }
}
More options for the checkout (e.g. other branches) can be found in the step documentation.
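For instance, checking out a specific branch could look like this (the URL and branch name are placeholders):

git url: 'https://git.example.com/repo.git', branch: 'develop'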
Finally, change the job to be built in regular intervals.
And now comes the point where I'm struggling (while editing the post): this probably builds the project every time (every 5 minutes in the example). I am not sure whether currentBuild.changeSets contains the changes that are explicitly checked out with checkout. If it does, then you can check whether it contains changes and abort the build if not. All not very nice...
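If changeSets does get populated, a guard along these lines could stop early (an untested sketch for a scripted pipeline; the URL is a placeholder):

node {
    // changeSets is only populated after the checkout has run
    git url: 'https://git.example.com/repo.git'
    if (currentBuild.changeSets.isEmpty()) {
        currentBuild.result = 'NOT_BUILT'
        echo 'No SCM changes detected, stopping early'
        return // nothing new, skip the actual build steps
    }
    // ... build steps follow here ...
}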
Jenkins declarative pipelines offer a post directive to execute code after the stages have finished. Is there a similar way to run code before the stages run, and most importantly, before the SCM checkout?
For example something along the lines of:
pre {
    always {
        sh 'rm -rf ./*'
    }
}
This would then clean the workspace of my build before the source code is checked out.
pre is a cool feature idea, but doesn't exist yet. skipDefaultCheckout and checkout scm (which is the same as the default checkout) are the keys:
pipeline {
    agent { label 'docker' }
    options {
        skipDefaultCheckout true
    }
    stages {
        stage('clean_workspace_and_checkout_source') {
            steps {
                deleteDir()
                checkout scm
            }
        }
        stage('build') {
            steps {
                echo 'i build therefore i am'
            }
        }
    }
}
For the moment there are no pre-build steps, but for your purpose it can be done in the pipeline job configuration (and also in multibranch pipeline jobs): where you define the location of your Jenkinsfile, choose Additional Behaviours -> Wipe out repository & force clone.
Delete the contents of the workspace before building, ensuring a fully fresh workspace.
If you do not really want to delete everything and want to save some network usage, you can just use this other option: Additional Behaviours -> Clean before checkout.
Clean up the workspace before every checkout by deleting all untracked files and directories, including those which are specified in .gitignore. It also resets all tracked files to their versioned state. This ensures that the workspace is in the same state as if you cloned and checked out in a brand-new empty directory, and ensures that your build is not affected by the files generated by the previous build.
This one will not delete the workspace but just reset the repository to its original state and pull new changes if there are any.
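For reference, the same "Clean before checkout" behaviour can be requested from a pipeline through a checkout extension (a sketch; the URL and branch are placeholders):

checkout([$class: 'GitSCM',
          branches: [[name: '*/master']],
          // equivalent of Additional Behaviours -> Clean before checkout
          extensions: [[$class: 'CleanBeforeCheckout']],
          userRemoteConfigs: [[url: 'https://git.example.com/repo.git']]])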
I use "Prepare an environment for the run / Script Content"