Job configurations are missing when copied from folders in Jenkins

This is 100% reproducible.
We are working on different branches of the release, and each branch should run the same jobs with some minor changes. So ideally I want to copy all the jobs from one working branch to a new branch.
I select New Item -> Folder and choose to copy from another folder.
The new folder contains all the jobs from the source folder, but all the job configurations are missing. In other words, I have jobs created with just their names, and I need to refill everything else. This is essentially useless.
I googled and did not see any related errors. Does anyone have any good advice on copying Jenkins folders? I am on Jenkins 1.651.3, Ubuntu 14.04.

I tried the same on Jenkins 2.19.1 and it worked without the issue you are seeing.
The best way to create a similar set of jobs for new branches is via Groovy, using the Job DSL plugin: https://jenkinsci.github.io/job-dsl-plugin/
Create a seed job that executes a Groovy script to iterate over a list of branches and create the jobs.
The Job DSL plugin is available for Jenkins 1.642 and above.
Note that manipulating content in JENKINS_HOME directly is not advised and is typically restricted.
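A minimal sketch of such a seed script, assuming hypothetical branch names, repository URL and build step (none of these come from the question):

```groovy
// Job DSL seed script: one folder and one build job per branch.
// Branch names, the repository URL and the build step are placeholders.
def branches = ['release-1.0', 'release-1.1', 'release-2.0']

branches.each { branchName ->
    folder("builds-${branchName}")

    job("builds-${branchName}/compile") {
        scm {
            git {
                remote {
                    url('https://example.com/repo.git')  // placeholder URL
                }
                branch(branchName)
            }
        }
        steps {
            shell('make all')  // placeholder build step
        }
    }
}
```

Re-running the seed job regenerates the jobs, so a change to the script propagates to every branch instead of having to be copied by hand.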

I should also mention that it turned out to be just a problem with our machine: lack of RAM. After we added more RAM, it's all working perfectly!

Related

Jenkins Multibranch Pipeline can't find Jenkinsfile in subdirectory using svn

I'm trying to set up a build using Multibranch. I'm basically having the same problem as stated here, but our SCM is Subversion. The bug in the Bitbucket Branch Source Plugin described here can therefore be ruled out, especially since our Jenkins has the newest version installed anyway.
I tried to find a similar ticket regarding my problem, but couldn't find one, so here I am.
As this particular project is configured so that configuration files (including things like the Jenkinsfile) are stored in a subfolder, I don't know what else to try, apart from configuring individual jobs. I'd rather stick with Multibranch Pipelines, however, as they help keep the build jobs tidy.
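One avenue that may be worth checking (an assumption on my part; the question doesn't say whether it was tried): multibranch projects have a "Script Path" field under Build Configuration that points the project at a Jenkinsfile in a subfolder. A hedged Job DSL sketch; the job name and path are placeholders, and the branch source block is omitted because it depends on the Subversion setup:

```groovy
// Hypothetical multibranch project whose Jenkinsfile lives in a subfolder.
// Job name and path are placeholders; add your SVN branch source as needed.
multibranchPipelineJob('my-project') {
    factory {
        workflowBranchProjectFactory {
            scriptPath('config/Jenkinsfile')  // same as Build Configuration -> Script Path
        }
    }
}
```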

Jenkins Deleting Workspace

I have Jenkins pipeline projects, and everything works fine as long as I run the project at least once per month. If I wait more than a month Jenkins will delete the workspace for that pipeline project, causing the project to do a brand new git checkout and compile. This results in a super slow build, since all of the intermediate object files/etc are regenerated from scratch.
I cannot find what setting in Jenkins is causing it to clean up these older workspaces. If I modify the pipeline to check out to a custom directory instead of the workspace directory then it works fine, so it doesn't appear to be the git plugin itself, or anything like that.
'Discard old builds' is disabled in the General settings for these projects.
Can someone point me to the setting that is causing these 'older' workspaces to get cleaned up?
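One likely culprit, offered as an assumption since the question doesn't confirm it: Jenkins core runs a background "Workspace clean-up" task that deletes workspaces it considers unused for 30 days, independently of 'Discard old builds'. It can be tuned or disabled with system properties passed to the Jenkins JVM, for example via JAVA_ARGS in /etc/default/jenkins on Ubuntu:

```
# Hypothetical JVM arguments; the property names are from Jenkins core.
# Keep unused workspaces for 90 days instead of the default 30:
-Dhudson.model.WorkspaceCleanupThread.retainForDays=90
# Or disable the clean-up task entirely:
-Dhudson.model.WorkspaceCleanupThread.disabled=true
```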

Jenkins pipeline checking out to new workspace when previous build has been aborted

So I'm running into a specific issue. I have a Jenkins Declarative Pipeline (from an SVN-hosted Jenkinsfile) that is configured to not run concurrent builds and to abort the previous build when a new one is triggered.
This works perfectly fine; however, the problem I am running into is that Jenkins will re-checkout the whole repository into a workspace directory with an @2 suffix for the subsequent build (this ONLY happens when a build is automatically aborted after a new one is triggered; if the first build ends successfully, it re-uses the same directory).
I've seen a ton of threads stating that this is by design, but from what I can see that only applies when concurrent builds are enabled. Since they're not, I'm confused as to what could cause Jenkins not to re-use the same workspace directory.
In case the "why" matters: I have a few large repositories (for Unreal Engine games specifically) that I need to build, and as an optimization of the time spent compiling, cooking and uploading the game, it makes perfect sense to cancel old builds. Instead, Jenkins decides to do a clean checkout of 10+ GB of game code and assets (20+ GB in the case of some other games) into another folder, because it refuses to reuse a folder that no longer has a build executing in it 😅.
Happy to accept all possible solutions/suggestions as I'm getting a lil' tired of pulling my hair out.
I was facing the same issue with my pipeline. I tried deleting the aborted builds and restarted Jenkins. I also deleted the @2 directories in my workspace and kept only the main directory. After this, I didn't face the same issue. This could happen because of the Jenkins cache. Make sure that your workspace correctly reflects the directory name mentioned in your Jenkinsfile.
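Another direction, offered as a sketch rather than a confirmed fix: pin the job to a fixed workspace with customWorkspace so a lingering aborted build can't push the next one into a @2 directory. The label, path and build step below are placeholders:

```groovy
// Hypothetical declarative pipeline: abort superseded builds and pin the workspace.
pipeline {
    agent {
        node {
            label 'unreal-builder'               // placeholder agent label
            customWorkspace 'workspace/my-game'  // fixed directory, no @2 suffix
        }
    }
    options {
        // abortPrevious requires a reasonably recent Pipeline: Job plugin.
        disableConcurrentBuilds(abortPrevious: true)
    }
    stages {
        stage('Build') {
            steps {
                sh './build.sh'  // placeholder build step
            }
        }
    }
}
```

The trade-off: with a single fixed workspace, a new build that starts before its aborted predecessor has fully shut down can collide with it, which is exactly what the @2 scheme exists to prevent.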

Jenkins - Deleting artifacts automatically

I am noticing that every time I run one of my jobs in Jenkins, two files are created in the /workspace/build/distributions dir, with the extensions .tar and .tgz. Every time I run the job, another set of these files is created; so if I run the job 3 times, there will be 6 files altogether. I have noticed that during the dependency-check phase these artifacts slow things down, so I want to remove them automatically each time before the job runs. I have attempted the configs in the image below. I have also tried the Workspace Cleanup plugin, but that deleted the whole workspace, which is definitely not what I wanted.
So what would be the best way to go about this?
What scm plugin are you using? Some of the scm plugins allow you to clean the workspace before an update (e.g. SVN's "Emulate clean checkout" and Git's "Clean before checkout" options).
If you're not using a scm plugin, can you remove the files in a batch/shell script during the first build step?
Or perhaps you can go about it from the reverse direction. Can you get rid of the files as the last build step of the job? That way, they are gone when the next build comes along.
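A hedged sketch of that shell-step idea; the paths follow the question's build/distributions layout, and the stage wrapper assumes a pipeline job (for a freestyle job, the same rm command would go in an "Execute shell" build step):

```groovy
// Hypothetical cleanup stage run before the build proper.
stage('Clean old distributions') {
    steps {
        // Remove leftover archives from earlier runs; paths mirror the question.
        sh 'rm -f build/distributions/*.tar build/distributions/*.tgz'
    }
}
```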

Running one build per artifact of another job

I have a Jenkins job that creates multiple Debian packages. Each created package file is archived as an artifact of the build. This works well so far.
Currently I am trying to trigger multiple builds of another job, one for each created package file. This job should install each package in an isolated Vagrant box and run some tests on it.
The question is how to trigger the builds. Since it would be nice to parallelize them, it is not as simple as doing one build for all packages. And because the number of packages is not always the same, duplicating the job for each package is very awkward.
Thanks,
krissi
To act on every build of a project, you probably want "Promotions". Read about it here:
How to promote a specific build number from another job in Jenkins?
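For the fan-out itself, a hedged scripted-pipeline sketch (not taken from the linked answer): it assumes the Pipeline Utility Steps plugin for findFiles, and 'package-test' stands in for the downstream job, which would need a matching PACKAGE parameter:

```groovy
// Hypothetical fragment: trigger one downstream build per .deb artifact, in parallel.
def packages = findFiles(glob: 'build/*.deb')  // Pipeline Utility Steps plugin
def runs = [:]
packages.each { pkg ->
    runs[pkg.name] = {
        build job: 'package-test',                                   // placeholder job name
              parameters: [string(name: 'PACKAGE', value: pkg.path)] // placeholder parameter
    }
}
parallel runs
```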
