I'm using the Jenkins Job DSL, and I need to copy the workspace from one job to another. I've been looking for an API to do that, without success.
Any idea how to do that using the DSL?
Regards.
It looks like you can use the Clone Workspace SCM plugin:
This plugin makes it possible to archive the workspace from
builds of one project and reuse them as the SCM source for
another project.
This is supported by Job DSL:
publishCloneWorkspace(String workspaceGlob) {}
Archives files for Clone Workspace SCM source.
cloneWorkspace(String parentProject, String criteria = 'Any')
Add a SCM source which copies the workspace of another project. Valid
criteria are 'Any', 'Not Failed' and 'Successful'.
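A minimal Job DSL sketch combining the two calls above (the job names are hypothetical, and the closure's criteria option is assumed from the plugin's documented settings):

```groovy
// Upstream job archives its workspace for reuse.
job('source-job') {
    steps {
        shell('make build')
    }
    publishers {
        publishCloneWorkspace('**/*') {
            criteria('Successful')   // only archive successful builds
        }
    }
}

// Downstream job uses that archived workspace as its SCM source.
job('target-job') {
    scm {
        cloneWorkspace('source-job', 'Successful')
    }
}
```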
I have a multibranch pipeline in Jenkins and need to pass the archive from the pipeline to a job it builds. The Copy Artifact plugin doesn't seem to support multibranch pipelines: it deletes my source project every time I save. Is there another plugin I can use to get the archive passed to the job? Or is there something I need to do to get this plugin working with a multibranch pipeline?
The Copy Artifact plugin doesn't seem to support multibranch pipelines.
Copy Artifact does not care about multibranch. From its perspective, a branch project is simply a job that is in some folder. And it does support folders. You just need to use the correct syntax for the source job. Last I remember, it supports either relative (e.g., ../multibranch/master) or absolute (e.g., /organization/repo/master).
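In a pipeline, that might look like the following sketch (the job path and filter are assumptions):

```groovy
// Copy from a branch job inside the multibranch folder, addressed by
// its folder-style path (absolute here; '../multibranch/master' would
// be the relative form).
copyArtifacts(
    projectName: '/organization/repo/master',
    filter: 'build/**',
    selector: lastSuccessful()
)
```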
I want to use a config file provided by config file provider plugin in a pipeline project.
However, when I run a build step on a slave, I get a "PermissionDenied" exception. The same runs fine on the master, however.
So the question is: what is the best way to share files between the master and slaves? I may not be able to use the Copy To Slave plugin, as there doesn't seem to be pipeline support for it.
If you want to share files between stages or nodes, you can use the stash/unstash steps; see the example here.
If you want to share files between builds, you can use the archive step and the Copy Artifact plugin.
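A short sketch of both options (the node labels and file name are hypothetical, and the file is written inline here in place of the Config File Provider step):

```groovy
node('master') {
    // stand-in for the file produced by the Config File Provider step
    writeFile file: 'settings.xml', text: '<settings/>'

    // share within this build, across nodes
    stash name: 'config', includes: 'settings.xml'

    // share with later builds (via the Copy Artifact plugin)
    archiveArtifacts artifacts: 'settings.xml'
}

node('linux-slave') {
    unstash 'config'          // same file, now on the slave
    sh 'cat settings.xml'
}
```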
I am new to Jenkins 2 and pipeline feature, and I am setting up a project to use the Jenkinsfile for pipeline.
I can see there are 3 workspaces created:
project-xxxxx
project-xxxxx#script
project-xxxxx#tmp
When I run tox in the pipeline, it complains that no tox.ini is found. I suspect it runs inside the folder project-xxxxx, which is empty, while the project files are inside project-xxxxx#script.
Should I use checkout scm to populate the workspace with the project files? Or am I supposed to use the project files in project-xxxxx#script, and if so, how do I do that properly?
Can someone please explain to me how those 3 folders work together?
You should not have to worry about the workspace in a pipeline. You start the build, you get a workspace, and anything you check out or copy into it should be there.
How do you start the pipeline? Inline script, from SCM, or via a multibranch or similar job type?
How are you getting files into your workspace?
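If the Jenkinsfile itself never checks anything out, the main workspace stays empty, which matches the symptom described in the question. A minimal Jenkinsfile sketch:

```groovy
// checkout scm fills the main workspace (project-xxxxx) with the
// project files, so tox.ini is found there.
node {
    checkout scm   // same repo/branch the pipeline was loaded from
    sh 'tox'
}
```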
I am using a Bitbucket team project (Bitbucket Branch Source Plugin).
I have configured the team project with my Bitbucket username, and it picked up all the repositories from Bitbucket, creating Jenkins projects with their branches.
Now I need to copy artifacts from one project to another. I used the snippet generator for "copy artifacts from another project" and tried giving it the project name, but it did not work.
For example, I have two repos in my Bitbucket project, each with one branch.
Then I created a Bitbucket team project; see the following image.
It was created dynamically from the Bitbucket team project, with the repos from Bitbucket.
Each project contains a branch; see the attached screenshot:
1) jenkinsdemoproject
2) marcurialproject
I have written a script in the jenkinsdemoproject project which creates an archive.
I need to copy those artifacts to the marcurialproject project.
But when I used the Copy Artifact plugin ("copy artifacts from another project") and added the project name, it did not copy the artifacts, since the project was created dynamically by the Bitbucket team project (Bitbucket Branch Source Plugin). Does anyone know how to use Copy Artifact with this plugin for a multibranch pipeline project? Please give me your suggestions.
Thanks
Pratik
Install the Copy Artifact plugin; it should work. The call must be something like this:
step([$class: 'CopyArtifact', filter: 'artifactName', projectName: 'JobName'])
I think that setup will behave like a multi-branch pipeline job.
That means the job name will include the parent job in which the auto-generated jobs exists.
For example, a multi-branch pipeline job called pipeline, with a branch develop will result in a job name "pipeline/branch/develop".
And if your branch contains a slash, for example "feature/some-feature" the / gets encoded to %252F. So the job name will then be "pipeline/branch/feature%252Fsome-feature".
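Following the naming scheme described above, the call would then become something like this (the job and branch names are hypothetical):

```groovy
// Copy from the develop branch of the multibranch job 'pipeline',
// using the job name format described above.
step([$class: 'CopyArtifact',
      filter: 'artifactName',
      projectName: 'pipeline/branch/develop'])
```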
I have a number of projects that need to be analysed by SONAR from jenkins. These projects include ant and maven projects. I have created a separate job for each SONAR analysis in jenkins.
Is it possible to have a single jenkins job in which I can pass some parameters from each individual sonar job and then see the dashboard?
If so, how do I go about it?
This solution is for Subversion and Maven.
Install the Parameterized Trigger Plugin
Create a Maven job for the SonarQube analysis, e.g. _common-sonar, with these settings:
Source Code Management: "Subversion", Repository URL: $PREVIOUS_SVN_URL, Check-out Strategy: "Always check out a fresh copy"
Build: Goals and options: install
Post-build Actions: "Sonar"
For the job you want to run analysis on add a Post-build Action "Trigger parameterized build on other projects" with these settings:
Projects to build: _common-sonar
Add Predefined parameters: Parameters: PREVIOUS_SVN_URL=${SVN_URL}
Now when the job-to-analyse completes it triggers the analysis job. The analysis job checks out the same SVN URL which was used by the first job.
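The post-build trigger above can also be expressed in Job DSL; a sketch assuming the Parameterized Trigger plugin's downstreamParameterized publisher (job names are the ones from this answer):

```groovy
job('job-to-analyse') {
    publishers {
        downstreamParameterized {
            trigger('_common-sonar') {
                parameters {
                    // single quotes keep ${SVN_URL} literal, so Jenkins
                    // expands it at trigger time
                    predefinedProp('PREVIOUS_SVN_URL', '${SVN_URL}')
                }
            }
        }
    }
}
```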
This solution works without scripting or copying workspaces but there are quite obvious limitations and non-ideal features:
the build command is always only mvn install
the SVN checkout may be from different revision than original build
checkout and build are always done from scratch
I didn't consider ant at all here.
Improvement ideas are quite welcome!
Late improvement edit:
Instead of using a Maven build (in _common-sonar), you may also invoke SonarQube directly via a standalone SonarQube analysis.
In addition to the SVN URL, you can add parameters for the build tag and the project name to use in Sonar. Simply add
NAME=YOUR_PROJECT_NAME
BUILDTAG=$BUILD_TAG
beneath the PREVIOUS_SVN_URL parameter.
In your _common-sonar you can use it with ${NAME} and ${BUILDTAG}.
In a similar situation I once faced, I created a single job which pulled the sources of several projects (each into its own sub-folder of the job's workspace).
Then I wrote a simple shell script that looped over all the directories and ran the Sonar analysis.
The job had the Sonar post-build plugin, which showed an aggregated report.
Unfortunately, I don't have an example as this was some years ago, but you can make it work.
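Today the same idea can be sketched in pipeline syntax instead of a shell loop (the sub-folder names and the scanner command are assumptions, not the original script):

```groovy
node {
    // one sub-folder per project, each pulled from its own repo
    def projects = ['project-a', 'project-b']
    for (p in projects) {
        dir(p) {
            sh 'sonar-scanner'   // run the analysis in that sub-folder
        }
    }
}
```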
I hope this helps.