We are thinking about moving our CI from Jenkins to GitLab. We have several projects that share the same build workflow. Right now we use a shared library where the pipelines are defined, and the Jenkinsfile inside each project only calls a method from that shared library which defines the actual pipeline. So changes only have to be made at a single point, affecting several projects.
I am wondering if the same is possible with GitLab CI? As far as I have found out, it is not possible to define the .gitlab-ci.yml outside the repository. Is there another way to define a pipeline and share this config with several projects to simplify maintenance?
GitLab 11.7 introduces new include methods, such as include:file:
https://docs.gitlab.com/ee/ci/yaml/#includefile
include:
  - project: 'my-group/my-project'
    ref: master
    file: '/templates/.gitlab-ci-template.yml'
This will allow you to create a new project on the same GitLab instance which contains a shared .gitlab-ci.yml.
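For illustration, the shared file referenced above might contain ordinary job definitions that every including project picks up; the job names and scripts here are placeholders, not from the original answer:
# /templates/.gitlab-ci-template.yml in my-group/my-project (contents are a hypothetical sketch)
stages:
  - build
  - test

build:
  stage: build
  script:
    - echo "shared build steps go here"

test:
  stage: test
  script:
    - echo "shared test steps go here"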
First let me start by saying: Thank you for asking this question! It triggered me to search for a solution (again) after often wondering if this was even possible myself. We also have around 20-30 projects that are nearly identical and have .gitlab-ci.yml files of about 400-500 LOC that each have to be changed if one thing changes.
So I found a working solution:
Inspired by the Auto DevOps .gitlab-ci.yml template that GitLab itself created, where a single template job defines all shared functions and every before_script loads them, I came up with the following setup.
Multiple project repos (project-1, project-2) requiring a shared set of CI jobs / functions
A functions script containing all shared functions, in a separate repo
Files
So, using a shared CI jobs script (functions.sh):
#!/bin/bash

function list_files {
  ls -lah
}

function current_job_info {
  echo "Running job $CI_JOB_ID on runner $CI_RUNNER_ID ($CI_RUNNER_DESCRIPTION) for pipeline $CI_PIPELINE_ID"
}
A common and generic .gitlab-ci.yml:
image: ubuntu:latest

before_script:
  # Install curl
  - apt-get update -qqq && apt-get install -qqqy curl
  # Get shared functions script
  - curl -s -o functions.sh https://gitlab.com/giix/demo-shared-ci-functions/raw/master/functions.sh
  # Set permissions
  - chmod +x functions.sh
  # Run script and load functions
  - . ./functions.sh

job1:
  script:
    - current_job_info
    - list_files
You could copy-paste this file from project-1 to project-2 and both would use the same shared GitLab CI functions.
These examples are deliberately verbose for demonstration purposes; optimize them any way you like.
Lessons learned
So after applying the construction above on a large scale (40+ projects) I want to share some lessons learned so you don't have to find out the hard way:
Version (tag / release) your shared CI functions script; changing one thing can now make all pipelines fail (see the sketch after this list).
Using different Docker images can conflict with the requirement for bash to load the functions (e.g. I use some Alpine-based images for CLI-tool-based jobs, and they only ship sh by default).
Use project-level CI/CD secret variables to personalize build jobs per project, e.g. environment URLs.
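A minimal sketch of the first point, assuming the shared script is tagged (the v1.0.0 tag is hypothetical): pin the before_script to that tag instead of master, so that changes to the shared repo cannot break all pipelines at once.
before_script:
  # Fetch a pinned release of the shared functions instead of master (tag name is hypothetical)
  - curl -s -o functions.sh https://gitlab.com/giix/demo-shared-ci-functions/raw/v1.0.0/functions.sh
  - chmod +x functions.sh
  - . ./functions.sh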
Since GitLab 12.6, it's possible to define an external .gitlab-ci.yml file.
To customize the path:
Go to the project's Settings > CI / CD.
Expand the General pipelines section.
Provide a value in the Custom CI configuration path field.
Click Save changes.
...
If the CI configuration will be hosted on an external site, the URL link must end with .yml:
http://example.com/generate/ci/config.yml
If the CI configuration will be hosted in a different project within GitLab, the path must be relative to the root directory in the other project, with the group and project name added to the end:
.gitlab-ci.yml#mygroup/another-project
my/path/.my-custom-file.yml#mygroup/another-project
Use the include feature (available since GitLab 10.6):
https://docs.gitlab.com/ee/ci/yaml/#include
I always wanted to post what we finally came up with, so here it is:
Right now we use a mixed approach: @stefan-van-gastel's idea of a shared CI library combined with the relatively new include feature of GitLab 11.7. We are very satisfied with this approach, as we can now manage the build pipelines for 40+ repositories in a single repository.
I have created a repository called ci_shared_library containing
a shell script for every single build job containing the execution logic for the step.
a pipeline.yml file containing the whole pipeline config. In the before_script we clone the ci_shared_library to /tmp/shared so that we can execute the scripts.
stages:
  - test
  - build
  - deploy
  - validate

services:
  - docker:dind

before_script:
  # Clear existing shared library
  - rm -rf /tmp/shared
  # Get shared library
  - git clone https://oauth2:${GITLAB_TOKEN}@${SHARED_LIBRARY} /tmp/shared
  - cd /tmp/shared && git checkout master && cd $CI_PROJECT_DIR
  # Set permissions
  - chmod -R +x /tmp/shared
  # Open access to registry
  - docker login -u gitlab-ci-token -p $CI_JOB_TOKEN $CI_REGISTRY

test:
  stage: test
  script:
    - /tmp/shared/test.sh

build:
  stage: build
  script:
    - /tmp/shared/build.sh
  artifacts:
    paths:
      - $CI_PROJECT_DIR/target/RPMS/x86_64/*.rpm
    expire_in: 3h
  only:
    - develop
    - /release\/.*/

deploy:
  stage: deploy
  script:
    - /tmp/shared/deploy.sh
  artifacts:
    paths:
      - $CI_PROJECT_DIR/tmp/*
    expire_in: 12h
  only:
    - develop
    - /release\/.*/

validate:
  stage: validate
  script:
    - /tmp/shared/validate.sh
  only:
    - develop
    - /release\/.*/
Every project that wants to use this pipeline config has to have a .gitlab-ci.yml. In this file, the only thing to do is include the shared pipeline.yml file from the ci_shared_library repo.
# .gitlab-ci.yml
include:
  - project: 'ci_shared_library'
    ref: master
    file: 'pipeline.yml'
With this approach, really everything related to the pipeline lives in one single repository and is reusable. We have the whole pipeline template in one file, but I think it would even be possible to split this up so that every single job lives in its own YAML file. That would be more flexible and would allow default jobs to be combined differently for projects that have similar jobs but don't all need every job.
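A minimal sketch of that split-up variant, assuming the shared jobs were moved into hypothetical files such as jobs/build.yml and jobs/deploy.yml inside ci_shared_library:
# .gitlab-ci.yml of a project that only needs the build and deploy jobs
# (the jobs/*.yml file names are hypothetical)
include:
  - project: 'ci_shared_library'
    ref: master
    file: 'jobs/build.yml'
  - project: 'ci_shared_library'
    ref: master
    file: 'jobs/deploy.yml'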
With GitLab 13.5 (October 2020), the include feature is even more useful:
Validate expanded GitLab CI/CD configuration with the API
Writing and debugging complex pipelines is not a trivial task. You can use the include keyword to help reduce the length of your pipeline configuration files.
However, if you wanted to validate your entire pipeline via the API previously, you had to validate each included configuration file separately which was complicated and time consuming.
Now you have the ability to validate a fully-expanded version of your pipeline configuration through the API, with all the include configuration included.
Debugging large configurations is now easier and more efficient.
See Documentation and Issue.
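For illustration, a manual job could fetch the fully expanded configuration from the CI Lint API; the endpoint path and the API_TOKEN variable are assumptions based on the GitLab 13.5 documentation, not part of the original answer:
# Hypothetical sketch: ask the API for the merged (fully expanded) configuration
# Assumes a Debian-based image and a project variable API_TOKEN with read_api scope
lint-expanded-config:
  stage: test
  when: manual
  script:
    - apt-get update -qqq && apt-get install -qqqy curl
    - curl --header "PRIVATE-TOKEN: $API_TOKEN" "$CI_API_V4_URL/projects/$CI_PROJECT_ID/ci/lint"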
And:
See GitLab 13.6 (November 2020)
Include multiple CI/CD configuration files as a list
Previously, when adding multiple files to your CI/CD configuration using the include:file syntax, you had to specify the project and ref for each file. In this release, you now have the ability to specify the project, ref, and provide a list of files all at once. This prevents you from having to repeat yourself and makes your pipeline configuration less verbose.
See Documentation and Issue.
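Applied to the shared-library setup above, the list syntax could look like this (the file names are illustrative):
include:
  - project: 'ci_shared_library'
    ref: master
    file:
      - 'jobs/build.yml'
      - 'jobs/deploy.yml'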
You could look into the concept of Dynamic Child pipeline.
It has evolved with GitLab 13.2 (July 2020):
Dynamically generate Child Pipeline configurations with Jsonnet
We released Dynamic Child Pipelines back in GitLab 12.9, which allow you to generate an entire .gitlab-ci.yml file at runtime.
This is a great solution for monorepos, for example, when you want runtime behavior to be even more dynamic.
We’ve now made it even easier to create CI/CD YAML at runtime by including a project template that demonstrates how to use Jsonnet to generate the YAML.
Jsonnet is a data templating language that provides functions, variables, loops, and conditionals that allow for fully parameterized YAML configuration.
See documentation and issue.
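As a rough sketch of the mechanism (the generator script is a placeholder for whatever produces the YAML, e.g. a Jsonnet invocation), a parent job writes a configuration file as an artifact and a trigger job runs it as a child pipeline:
stages:
  - generate
  - trigger

generate-config:
  stage: generate
  script:
    # generate-ci.sh is hypothetical; it just has to print valid CI YAML
    - ./generate-ci.sh > generated-pipeline.yml
  artifacts:
    paths:
      - generated-pipeline.yml

run-child-pipeline:
  stage: trigger
  trigger:
    include:
      - artifact: generated-pipeline.yml
        job: generate-config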
Is it possible to build a project inside a sub-directory of a Bitbucket repository in Jenkins? I have been trying to get Jenkins to build from a sub-directory of the Bitbucket repository instead of building everything from the master branch, because I do not want to put load on the servers just for testing purposes. I would like to know if this is possible and how it can be done. For instance, I would like to build only the project in sub directory 2 of the repository, as shown in the layout below. Thanks.
*/master
- Directory 1
  - sub directory 1
    - pom.xml
  - sub directory 2        <-- build only this project
    - pom.xml
- Directory 2
  - sub directory A
  - sub directory B
    - pom.xml
If anyone is still looking for an answer: you need to provide the path to the pom.xml in the Root POM field of the Build section on the job's Configure page.
Go to Configure -> scroll down to Build -> in Root POM enter subdirectory name/pom.xml
E.g. if your pom.xml is present at Repo/subdirectory/pom.xml, you need to enter subdirectory/pom.xml in Root POM.
I have a number of Gradle builds that work very well from the command line, from buildship, etc.
However now I am porting them to a Jenkins system. And it is producing some very strange results. I'm pretty much a total newbie to Jenkins, so this may have an easy answer. So far I haven't found it.
I am using the Gradle Plugin for Jenkins, v.1.24 to configure my build in Jenkins. However, Jenkins (at least as I have it configured) organizes its build structure as {jenkins root}/data/jobs/{project_name}/workspace. When code is checked out of source control it is deposited in that directory, not in a directory named {project_name}.
Gradle seems to assume that the directory in which it is running names the project, and when I'm running outside of Jenkins this assumption is true: the project name that Gradle sees is the name of the project that was checked out from source control. Project.name is a gettable but not a settable property of a Gradle Project. So in the Jenkins case, the archives that Gradle builds are named workspace* rather than {project_name}*. It is also named workspace in the repositories it publishes to. I must be missing something very obvious, but for the life of me I cannot figure out what it is.
Has anyone grappled with this?
UPDATE - It appears that the problem is that the people who designed my Jenkins instance knew nothing about Gradle. The {jenkins root}/data/jobs/{project_name}/workspace layout that I described above is not required by Jenkins, but apparently was felt to be useful for some reason in some other, non-Gradle context. So the question becomes, where is the project layout set up in the Jenkins configuration - OR - can Gradle be modified somehow to assume a different project layout/naming strategy.
Set Manage Jenkins → Configure System → Advanced... (the one right at the top) → Workspace Root Directory: ${JENKINS_HOME}/workspace/${ITEM_FULLNAME}.
The inline help:
Specify where Jenkins would store job workspaces on the master node. (It has no effect on builds run on slaves.) This value can include the following variables.
${JENKINS_HOME} — Jenkins home directory.
${ITEM_ROOTDIR} — Root directory of a job for which the default workspace is allocated.
${ITEM_FULL_NAME} — '/'-separated job name, like "foo/bar".
Changing this value allows you to put workspaces on SSD, SCSI, or even ram disks. Default value is ${ITEM_ROOTDIR}/workspace.
.../jenkins/config.xml
...
<workspaceDir>${JENKINS_HOME}/workspace/${ITEM_FULLNAME}</workspaceDir>
...
Gradle seems to assume that the directory in which it is running names the project
Yes, this is Gradle's default behavior, but it can easily be overridden. If it is just the output artifact name you're concerned about, override the jar name with:
jar {
    // Use the real project name instead of the Jenkins workspace directory name
    baseName = 'actualProjectName'
}
I am trying to run Heat.exe in a TFS build. Execution is fine, although heat gives me this error message:
'C:\PathToplace\Development\Install\WixInstaller\SKRwixInstaller\HarvestedFiles' did not contain any files or sub-directories and since empty directories are not being kept, there was nothing to harvest.
The message is clear. Before heat I run a robocopy task, which should have copied all the files needed for harvesting. This works fine locally, but on TFS there are no files or directories at all, just the empty directory I created, which I find a bit weird. The output files are present where they are supposed to be; my output location is "as configured". Any idea?
I have the following problem on my Jenkins with Perforce setup:
The original branch content is as described below; all 3 modules are needed to build the product, so a build workspace would contain all 3 modules.
//depot-product/moduleA
               /moduleB
               /moduleC
I've created a mapping to look only at changes on moduleA, so my mapping looks like this:
//depot-product/moduleA/... //workspace/moduleA/...
However, what I didn't expect is that moduleA's code contains symlinks to directories in other modules, so p4 fails with the message below:
Caught exception communicating with perforce. Errors encountered while force syncing: error: open for write: /home/user/jenkins/jobs/jobModuleA/workspace/moduleA/dir1/symDir/file.txt: No such file or directory
symDir is a link to a file in moduleB.
So my question is: can I tell p4 sync to ignore files that are outside this workspace, or directories pointed to by symlinks?