I want to use a config file provided by the Config File Provider plugin in a pipeline project.
However, when I run a build step on a slave, I get a "PermissionDenied" exception; the same step works on the master.
So the question is: what is the best way to share files between the master and slaves? I may not be able to use the Copy To Slave plugin, as it doesn't seem to have pipeline support.
If you want to share files between stages or nodes, you can use the stash/unstash steps; see the example here.
If you want to share files between builds, you can use the archive step together with the Copy Artifact plugin.
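In a pipeline, that combination looks roughly like the sketch below (the older archive step is nowadays usually written as archiveArtifacts; the job name and file pattern here are placeholders, not from the question):

// In the producing job's Jenkinsfile: mark the generated files as artifacts
node {
    archiveArtifacts artifacts: 'output/**/*.yaml'
}

// In the consuming job's Jenkinsfile: copy them over (needs the Copy Artifact plugin)
node {
    copyArtifacts projectName: 'producer-job', selector: lastSuccessful()
}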
Related
I am building a Shared Library (I'm not a CI engineer) for a Jenkins pipeline. As part of it, we are thinking of keeping the configuration in a YAML file and having the Jenkinsfile pipeline script read from that YAML file.
So we are planning to commit the Jenkinsfile and the YAML file in one Git repository (let's say repo A), while the job runs on a slave machine using another Git repository (let's say repo B). The Jenkinsfile is executed from the master after it clones repo A, and the YAML file is also in the master's workspace. On the slave, repo B is cloned and the build takes place as defined in the Jenkinsfile. The question I have is: how do I read the YAML file from the Jenkinsfile script without having to clone repo A on the slave, i.e. how do I reference a file that is present on the master but not on the slave? This question arises because whatever file I try to open is opened on the slave, not on the master.
Thanks in advance.
EDIT:
Something I forgot to mention: we actually thought of using stash on the master and unstash on the slave. But the problem is that we only do the job configuration; the environment is provided to us by another team, which also provides it to many other teams. Because of that, the master rarely has free executors, so even when we run the job it hangs waiting for the master to become free. Is there any other way to load the YAML file while the pipeline script is being loaded into the master's memory?
I think the most straightforward way is to use the stash. Stashes are temporary packages that can be created on one node and copied to any other node.
There is some overhead on the master for the package creation, so stashes are not recommended for really big files, but they are ideally suited to your use case of transferring a small configuration file.
node('master') {
    // Create temporary stash package on master
    stash name: 'MyConfiguration', includes: 'SomeFile.yaml'
}

node('MySlave') {
    // Copy to and extract the stash package on slave
    unstash 'MyConfiguration'
}
This expects 'SomeFile.yaml' to be in the workspace on the master, and it will also be extracted into the workspace on the slave. In case you want a subdirectory of the WORKSPACE, simply wrap the stash and/or unstash steps in a dir step.
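For example, to keep the file in a subdirectory on both sides (the 'config' directory name is just a placeholder):

node('master') {
    dir('config') {
        // stash picks the file up relative to the 'config' subdirectory
        stash name: 'MyConfiguration', includes: 'SomeFile.yaml'
    }
}

node('MySlave') {
    dir('config') {
        // unstash extracts into WORKSPACE/config on the slave
        unstash 'MyConfiguration'
    }
}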
Can I change the working directory during a Jenkins job for all successive steps?
My job's first step checks out a Git project. Unfortunately, this project is a mix of technologies: it's not a Java/Maven project (so the trick of 'mvn -f subdir/pom.xml' doesn't apply), nor is it a Gradle project. So I'd like to change to a subdirectory of the checked-out project and start running Jenkins build steps there: invoking shell scripts, running tox to test Python code, running docker to build images, and so on.
Maybe Jenkins wants every step to begin in $WORKSPACE, and allowing a directory change during the job would break some vital assumptions?
I know this has been asked before. Similar questions, but with answers specific to Maven:
Jenkins Maven Build -> Change Directory and
Jenkins: How To Build multiple top-level projects from a git repository?
Similar question but answer specific to gradle:
Change directory during a build job on jenkins
You can split this into separate jobs based on the subfolders and use a checkout filter in your SCM configuration, so that only the subfolder you need for a given job gets cloned into its workspace. Then, as the first step of the build, use a batch/shell command to move all files from the subfolder into the workspace root, and run the rest of your steps from there, as in the sketch below.
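If the job is expressed as a pipeline, that idea could look roughly like this (the subfolder name and the tox step are placeholders); in a freestyle job the same shell line simply goes into an "Execute shell" build step:

node {
    checkout scm
    // Move the contents of the subfolder into the workspace root...
    sh 'mv my-subfolder/* "$WORKSPACE"/'
    // ...and run the remaining steps from there
    sh 'tox'
}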
I want to use Jenkins and store both the configuration and the pipeline in my SCM (e.g. Git). To do so, I created a directory, let's say "jobs", in the root of my project, where I will store jobs.groovy files written as Job DSL plugin files.
Should I do everything in a single job file, like fetching the source code, testing it, maybe building Docker images if necessary, then deploying to the AWS cloud? Or should I create a different job for each operation? If so, how can I create a pipeline from these job files?
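For reference, one of the jobs/*.groovy files would look something like this Job DSL sketch (the job name and repository URL are placeholders):

pipelineJob('my-app-pipeline') {
    definition {
        cpsScm {
            scm {
                git {
                    remote { url('https://example.com/my-org/my-app.git') }
                    branch('*/master')
                }
            }
            // The Jenkinsfile is read from the repository itself
            scriptPath('Jenkinsfile')
        }
    }
}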
Take a look at the Jenkins Configuration as Code plugin. The following link may be helpful:
https://github.com/tomasbjerre/jenkins-configuration-as-code-sandbox
I would like to share the byproducts of one Jenkins job with another one that runs afterwards.
I am aware that I can set "use custom workspace", but that would merge the jobs together, which is not what I want. I just need to move a few files to a location where they can be read by the next job.
So far I can't find out how to actually tell a Jenkins job to look in a specific folder, since Jenkins has no concept of the file system beyond what is going on in the job's workspace folder.
Is there a way to access the host file system, or to declare a shared folder inside Jenkins (for example in the main workspace folder, which contains all the other jobs' workspaces), so I can copy and read files in it from different jobs?
Where possible I would like to avoid plugins and extras; I would like to use what is included with Jenkins base.
I realize you want to avoid plugins, but the Jenkins-y way to accomplish this is to use the Copy Artifacts plugin, which does exactly what you want.
There are a variety of problems that you may run into when trying to manage the filesystem yourself. (How do you publish to a common location when running on different build nodes? How do you handle unsuccessful builds?) This solution uses Jenkins to track builds and artifacts. In the absence of a separate artifact repository, it's a lot better than trying to manage it yourself.
To use Copy Artifacts:
As a Post-Build step, choose "Archive Artifacts" in the first job and enter the path(s) to the generated files.
Then in the second job, add a "Copy Artifacts from another project" build step to grab some or all files marked as artifacts in your first job. (By default, Jenkins will re-create the paths of the generated files in the second job's workspace, which may or may not be what you want, but you can change this behavior.)
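If either job is a pipeline rather than a freestyle job, the equivalent steps would be roughly the following (project name, file pattern and target directory are placeholders); flatten: true is what changes the path-re-creation behaviour mentioned above:

// First job: mark the generated files as artifacts
archiveArtifacts artifacts: 'certs/**'

// Second job: copy them in, flattening the original directory layout
// into a single 'incoming' directory instead of re-creating the paths
copyArtifacts projectName: 'first-job',
              selector: lastSuccessful(),
              target: 'incoming',
              flatten: true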
Configure Jenkins to run a Maven build and deploy your artifacts with "mvn clean deploy". This will push them to an "artifact server", which you probably already have; if not, you will need to add/configure one.
Then configure your downstream job, also a Maven job, to depend on the same artifact that was published by the upstream job. This will trigger a download of the artifact from the artifact server and make it available to the build.
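A minimal sketch of the two jobs as pipelines, assuming the artifact server is already configured in settings.xml / the project's distributionManagement (the goals and layout here are assumptions, not from the answer):

// Upstream job: builds and publishes its artifacts to the artifact server
node {
    checkout scm
    sh 'mvn clean deploy'
}

// Downstream job: its pom.xml declares a dependency on the upstream artifact,
// so a normal build resolves and downloads it from the artifact server
node {
    checkout scm
    sh 'mvn clean install'
}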
So I have a few separate jobs in Jenkins. The first one gets the project from a Git repository, builds it, and produces artifacts. The second one has to copy certificates from the first job and publish them to Artifactory (I tried to do this with the Artifactory plugin). But the thing is that the Artifactory plugin is only available in the build job; there is nothing like "Generic-Artifactory integration" in the second job's configuration.
Does anyone know what the requirements are for making the plugin work in the publish job?
You can write a small shell script leveraging the Artifactory REST API and execute it in your second, non-build job.
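The deploy API is a plain HTTP PUT, so the script can be a single curl call. A rough sketch, shown here inside a pipeline sh step (server URL, repository path, file name and credentials ID are placeholders); the same curl line works verbatim in a freestyle "Execute shell" step:

node {
    withCredentials([usernamePassword(credentialsId: 'artifactory-creds',
                                      usernameVariable: 'ART_USER',
                                      passwordVariable: 'ART_PASS')]) {
        // PUT the certificate into the target repository path on Artifactory
        sh '''
            curl -u "$ART_USER:$ART_PASS" -T certs/server.crt "https://artifactory.example.com/artifactory/generic-local/myapp/certs/server.crt"
        '''
    }
}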
I have done a similar thing with Maven and a zip file: I added a Maven build step that calls deploy:deploy-file, with my Artifactory repository configured in settings.xml, so the zip is deployed directly to that repository.
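Spelled out, the deploy-file invocation looks roughly like this (the coordinates, file name, repository id and URL are placeholders, and the repository credentials come from settings.xml); it is shown here inside a pipeline sh step, but in a freestyle Maven job the same goal and properties go into the Maven build step's Goals field:

sh '''
    mvn deploy:deploy-file \
        -DgroupId=com.example \
        -DartifactId=my-certs \
        -Dversion=1.0.0 \
        -Dpackaging=zip \
        -Dfile=target/my-certs.zip \
        -DrepositoryId=my-artifactory \
        -Durl=https://artifactory.example.com/artifactory/libs-release-local
'''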