I have a problem with transferring an artifact from a source build to a triggered build.
What I have
Jenkins plan A creates an artifact;
Jenkins plan A calls a build on another project (plan B) and passes some parameters, including the file's location as file.zip=path_to_artifact/*.zip;
The triggered Jenkins plan B takes the parameters and fails - it doesn't find the .zip artifact. As I understand it, such parameter passing is unsupported.
P.S.: Jenkins plan B has a file parameter named file.zip.
Question
How can I solve this problem?
Is it possible to solve it without pipeline script?
Related
The company I work for is evaluating Jenkins 2.71, in particular the Pipeline and Blue Ocean plugins. We have also tested GoCD, and we need, as in GoCD, a way for a pipeline to automatically fetch the artifacts from 2 other pipelines (taking the last successful result of each of them). Here is our case.
We have these initial pipelines (build & run tests), which reflect 2 projects:
frontend, ~ 15 minutes
backend, ~10 minutes
I created a pipeline called configure (~1 minute) with e.g. a parameter called customer-name; it takes the backend and frontend files, puts them together, applies customer-specific configurations and customizations, and produces deployable artifacts. Instead of "customer-name" I could also parallelize this job to create the artifacts for every customer at once, separated into different directories.
The next pipeline would deploy them to different test servers, separated per customer. This could also be part of the same configure pipeline; we still have to see how to put things together in Jenkins...
Ideally, I need the configure pipeline to be triggered automatically (or on demand) after each frontend or backend success and to take as input the last successful artifacts from these 2 pipelines. But not just the last successful build: the dependency has to be keyed on the git branch name.
E.g. we have:
backend branches:
master
release/2017.2
frontend branches:
master
release/2017.2
In the pipeline editor, I found a Build Triggers option and set it as follows: Build after other projects are built > Projects to watch: frontend, backend > check Trigger only if build is stable (or, better in my test environment full of failures, Trigger even if the build is unstable).
Searching further, I found the Copy Artifact Plugin.
But now the big question, how to fetch the last successful artifacts from these pipelines with the same git branch name?
Because we don't want to mix e.g. a backend build of "release/2017.2" with a frontend "master" build, it has to find the last successful build having the same relationship (or parameter, or whatever you want to call it); in our case the association is the git branch name.
Is it possible to achieve this? If yes, how?
The copy artifact plugin seems to work in a freestyle project. Would it work in a pipeline? That's also a concern...
Thanks
Yes, the Copy Artifact plugin works in both freestyle and pipeline projects; in a pipeline you use the copyArtifacts step that I referenced in my comment. Note that in the Pipeline Syntax link it's kind of hidden: you have to first select "step: General Build Step" from the drop-down, and then it will give you the Copy Artifact pipeline command builder.
I'm going to assume that your frontend and backend projects are built as multi-branch pipelines, as that would probably be easiest to maintain so that you don't have to keep creating new projects for every release. You can reference these projects from other projects as <project name>/<branch name> (sometimes I've had to replace the / with %2F instead, I think mostly in freestyle projects). You could then set up your configure project as a parameterized build (either pipeline or freestyle), say with a string parameter named PROJECT_BRANCH_NAME. Then put the following in your frontend/backend project pipeline scripts to trigger a build of your configure project:
build job: 'configure', parameters: [[$class: 'StringParameterValue', name: 'PROJECT_BRANCH_NAME', value: env.BRANCH_NAME]]
Then you should just be able to have your configure project reference frontend/${PROJECT_BRANCH_NAME} and backend/${PROJECT_BRANCH_NAME} (or ${env.PROJECT_BRANCH_NAME} in a pipeline script) when copying the artifacts.
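As a rough sketch (not a definitive implementation) of what the configure pipeline script could look like, assuming multi-branch projects named frontend and backend and the PROJECT_BRANCH_NAME string parameter described above:
// Sketch of the 'configure' pipeline; project names, parameter name and paths are assumptions.
node {
    // Copy the last successful artifacts of the matching branch of each project.
    // Branch names containing '/' may need to be encoded as %2F, as noted above.
    copyArtifacts(projectName: "frontend/${env.PROJECT_BRANCH_NAME}",
                  selector: lastSuccessful(),
                  target: 'frontend-artifacts')
    copyArtifacts(projectName: "backend/${env.PROJECT_BRANCH_NAME}",
                  selector: lastSuccessful(),
                  target: 'backend-artifacts')

    // ... combine the files, apply the customer-specific configuration ...

    // Archive the resulting deployable artifacts (pattern is an assumption).
    archiveArtifacts artifacts: 'dist/**/*.zip'
}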
Also, is there a particular reason why you're evaluating specifically Jenkins 2.7? 2.7 is a year old now, and there have been a few new LTS releases since then. I'd recommend staying reasonably up-to-date unless you know there's a specific reason you want 2.7.
Current scenario: Build and deployment happen in the development environment, the code is checked in to Git, and the JAR file is placed in Nexus. Then a change request is raised to deploy the same to the QA environments. The CR has two parameter text files attached (one contains the Nexus path, the other the website URL), which act as input for the parameterized build along with the selection of the environment. Run deploy.
Target scenario: We want to remove the CR part. Instead, we want a file (containing the parameters that were attached to the CR) which, when pushed to Git, automatically feeds the respective parameters of the parameterized Jenkins job and selects the environment from the dropdown.
What is the best way to achieve this - creating another Jenkins job that reads the parameters from the file, or is there some other way?
P.S. We don't want to make any edits to the existing parameterized Jenkins jobs.
Using the Jenkins GitHub Plugin, you can create a separate job with a GitHub build trigger. By adding the GitHub repo (where the parameter file is pushed) to this Jenkins job, you can process the file to get the parameters you want in order to kick off the appropriate Jenkins jobs.
For Jenkins to process the parameters, one option is to use the EnvInject Plugin. (As suggested in this answer.) Another suggestion: Extended Choice Parameter Plugin (from this answer).
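If you prefer a pipeline for that separate trigger job, here is a minimal sketch of the idea; the file name deploy.properties, the parameter names, and the job name existing-deploy-job are placeholders, and readProperties comes from the Pipeline Utility Steps plugin:
// Hypothetical trigger job: reads parameters from the file pushed to the repo
// and kicks off the existing parameterized job without modifying it.
node {
    checkout scm                                            // the repo the parameter file is pushed to
    def props = readProperties(file: 'deploy.properties')   // Pipeline Utility Steps plugin
    build(job: 'existing-deploy-job', parameters: [
        string(name: 'NEXUS_PATH', value: props['NEXUS_PATH']),
        string(name: 'WEBSITE_URL', value: props['WEBSITE_URL']),
        string(name: 'ENVIRONMENT', value: props['ENVIRONMENT'])
    ])
}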
I'm using a Jenkins job to trigger a few downstream jobs. I pass parameters through a properties file. But there's a file that was uploaded when the upstream job was submitted that I want to pass to the downstream jobs. There is an option under the Copy Artifact Plugin that allows copying from the workspace of the latest completed upstream job.
The problem is that my upstream job is blocked on the downstream jobs and cannot complete before them. For the same reason, I cannot copy the file as an artifact, since archiving artifacts is only possible as a post-build action (AFAIK).
Is there any way around this problem?
Could you stick the uploaded artifact from the upstream job into an online file repository like Artifactory, or an external network/file share, and access it in the downstream job?
That way, you just need to pass in the path of the file rather than the entire file, and the child can download it.
You could even use the build number of the upstream job as the unique identifier of the artifact, so you just need to pass the build number down to download it.
http://myonlinerepository/{build number}/upload.zip
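A rough pipeline sketch of that idea (the repository URL is the placeholder above; the curl upload/download commands and the parameter name are only illustrative):
// Upstream pipeline: publish the file keyed by this build's number,
// then pass only the number to the downstream job.
node {
    sh "curl -T upload.zip http://myonlinerepository/${env.BUILD_NUMBER}/upload.zip"   // illustrative upload
    build(job: 'downstream-job', parameters: [
        string(name: 'UPSTREAM_BUILD_NUMBER', value: env.BUILD_NUMBER)
    ])
}

// Downstream pipeline: fetch the file using the build number it was given.
node {
    sh "curl -o upload.zip http://myonlinerepository/${params.UPSTREAM_BUILD_NUMBER}/upload.zip"
}
Because the upload is an ordinary build step rather than a post-build action, it can happen before the upstream job blocks on its downstream builds.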
I'm running a Jenkins job on a slave and I want to store the generated artifacts on the server. Since the job is currently running on the slave, the artifacts are also created there.
I tried using post-build actions ---> archive the artifacts. But it throws the following build error:
ERROR: No artifacts found that match the file pattern "**/*.gz". Configuration error?
ERROR: '**/*.gz' doesn't match anything: '**' exists but not '**/*.gz'
Any help in this regard is highly appreciated.
Sounds like the Copy To Slave Plugin is what you need.
It can copy to the slave (before the build) and from the slave (after the build).
Copy files back to master node:
To activate this plugin for a given job, simply check the Copy files back to the job's workspace on the master node checkbox in the Post-build Actions section of the job. You then get the same two fields as in the Copy files to slave node before building section (note that the label in the plugin's screenshot is old).
If you want to copy artifacts from JobA to the workspace of some other job, you can do it using the Copy Artifact Plugin, which is very simple to understand.
In case you just want to archive the artifacts already in JobA, then you are already going in the right direction and need to check what you are missing... are you sure that the artifacts are in the current workspace?
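For what it's worth, a minimal pipeline sketch of archiving from a slave (archiveArtifacts copies matching files from the slave workspace back to the master; the label and file names are assumptions):
// Run on the slave, produce a .gz file in the workspace, then archive it to the master.
node('my-slave') {
    sh 'mkdir -p dist && tar -czf dist/build.tar.gz build/'   // illustrative step producing a .gz file
    archiveArtifacts artifacts: '**/*.gz'                     // fails with "doesn't match anything" if nothing matching exists in the workspace
}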
Doron
I have a project where part of the build process is to create a native library on a remote machine. This is currently a manual process outside of the CI builds made by Jenkins.
The setup in question is that the Jenkins master server builds a Git-based Maven project, which has a dependency on a native library that can only be built on a specific machine. Jenkins can't compile this module, and because of this, it is currently a manual process.
I would like to install a Jenkins slave on the machine that creates the native library and have it return the compiled files to the Jenkins master, without handling any other parts of the build.
I am having trouble figuring out whether this is even possible. The articles I have found on the subject discuss Jenkins slaves as a means of distributing the build, but I want the slave to take responsibility for a small part of the build process and nothing else. The Jenkins master should just send the build request to the slave and wait for the result, instead of trying to compile the code itself.
I do exactly the same. My setup is very similar to what Mark O'Connor and gaige are advising, and I am using the Copy Artifact plugin.
job A: produces a zip file on a Mac
job B runs on slave B - a Windows machine - and takes the zip as input and produces an MSI
Here's the important part in the config of job B (a pipeline sketch of the same steps follows the list):
restrict job B to the proper slave using labels
make sure job B happens after job A
make sure artifacts from job A are sent to job B before your build
build your stuff
archive artifacts produced by job B
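As a hedged pipeline sketch of those steps (the slave label, the job A name, the build script, and the file patterns are all assumptions; ordering after job A is handled by the upstream trigger or a build step there):
// Job B as a pipeline, mirroring the config described above.
node('windows') {                                  // restrict job B to the proper slave via its label
    copyArtifacts(projectName: 'job-A',            // fetch the zip produced by job A before building
                  selector: lastSuccessful(),
                  filter: '**/*.zip')
    bat 'build_msi.bat'                            // illustrative: build the MSI from the zip
    archiveArtifacts artifacts: '**/*.msi'         // archive the artifacts produced by job B
}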
Delegating part of a job to a slave is something that would have to be done external to Jenkins, for example, using ssh.
However, as #kan indicates, you most likely want to extract the native library build as a separate job and then have that job execute on a particular slave, or any slave that meets a specific criteria.
To do this, my suggestion would be to use Labels in the node configurations to determine which slaves can be used for building that particular job.
In Jenkins > nodes > <slave node>, use the Labels property to set one-word labels that indicate your specific requirements, such as the OS or processor type.
Then, in the jobs that are node-specific, check Restrict where this project can be run and set the Label Expression to something that meets your criteria. If the criteria is simple, it will just be a single word, if you need a boolean, you can use those as well (such as OSX&&Lion in our case).
I believe this is all in the standard version of Jenkins, without need for a special plugin. Leave me a comment if it isn't and I'll try and diagnose which plugin enables this functionality.
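For a pipeline job, the equivalent of that restriction is the label expression on the node step; a minimal sketch (the build command and output pattern are assumptions, the label expression is just the example above):
// Run the native-library build only on agents matching the label expression.
node('OSX && Lion') {
    checkout scm
    sh './build-native.sh'                         // illustrative native build step
    archiveArtifacts artifacts: '**/*.dylib'       // assumed output pattern
}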
This problem is solved by using a binary repository manager to centralize your software artifacts. Personally I use Nexus, but it could be something as dumb as a remote file system.
The idea is to publish the built artifact after each Jenkins job (if you don't like Nexus, you could use one of the Publish over plugins) and retrieve it as a build dependency in the next job.
This approach means it no longer matters where the build executes, and it has the added advantage of decoupling the build of each module component.
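A minimal sketch of the idea, assuming a Maven project and a slave label for the special machine (both are placeholders; the repository coordinates live in the POM's distributionManagement):
// On the machine that can build the native library: build it and publish it to the repository manager.
node('native-build-machine') {
    checkout scm
    sh 'mvn deploy'          // publishes the library to Nexus per the POM's distributionManagement
}

// On any other node, the published library is now an ordinary Maven dependency,
// so this build no longer cares where the native part was compiled.
node {
    checkout scm
    sh 'mvn clean package'
}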