Shared library cloned on build history and workspace - jenkins

I have many Jenkins jobs using a shared library:
@Library('my_shared_library') _
In the folder of each build of each project using the library, there's a local copy of it, for example:
jenkins/jobs/my_project/jobs/builds/10/libs/my_shared_library
That means it's copied into each build's history.
Plus, I also see it in the project's workspace, under:
jenkins/workspace/my_project@libs/my_shared_library
In this last directory there are sometimes several copies of it, like:
my_shared_library
my_shared_library@2
my_shared_library@tmp
...
The question is: is there some special way of handling this so that there will be fewer copies? Or, why is there a copy in the project's workspace and another in the build history?
This ends up creating lots of files, which caused my Jenkins to run out of inodes.

Related

Jenkins - how to add additional folder with subfolders, e.g. Images

I have a web project that Jenkins is building perfectly and pushing to Octopus Deploy.
I now have an additional folder, with subfolders, e.g. Images, which I need to include.
This is not directly part of the .net build and we used to copy it manually afterward.
Do I need a specific plugin which I can use to select the folder to include?
Which plugin?
Where in the build process does this plugin run?
The build and deploy to Octopus is done in one step -
where do I fit in this additional folder so that it is included in the push to Octopus?
This is not directly part of the .net build and we used to copy it manually afterward.
If your Jenkins server can access that additional folder on a shared path, add a pre-build step which, as an "Execute Windows batch command" step, copies that folder into the Jenkins workspace.
No plugin needed here.
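As an illustration only, here is a minimal sketch of such a copy step. The share path and folder name are placeholders for whatever your setup uses; in a freestyle job the xcopy line goes directly into the "Execute Windows batch command" step, while in a Pipeline job it would look roughly like this:
// Hypothetical Pipeline equivalent of the pre-build copy step described above.
// \\fileserver\shared\Images is a placeholder for your shared path.
pipeline {
    agent any
    stages {
        stage('Copy Images folder') {
            steps {
                // /E copies all subdirectories (including empty ones), /I treats the target as a directory, /Y overwrites without prompting
                bat 'xcopy /E /I /Y "\\\\fileserver\\shared\\Images" "%WORKSPACE%\\Images"'
            }
        }
        // ... your existing build and Octopus push steps would follow here
    }
}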
Once that is done, you would still need to modify Octopus accordingly, to take into account that new copied folder.
See:
"How to add a folder to a nuspec file"
"How to include directories recursively in NuSpec file"

Download full workspace from Jenkins build

I am new to Jenkins and I have tried downloading a zip archive of the workspace in Jenkins, but I only get a part of it. Source folders like tensorflow or tools are not present inside the archive. Is this normal?
If so, how do I get all of them inside a zip file?
Use the "Archive the artifacts" post-build action to add your workspace to the archived artifacts, which will make it easily downloadable.
But be aware that an artifact, in the Jenkins sense, is the result of a build - the intended output of the build process.
A common convention is to put the result of a build into a build, target or bin directory.
The Jenkins archiver can use globs (target/*.jar) to easily pick up the right file even if you have a unique name per build.
Putting a complete workspace into the archive, by contrast, will take a lot of time.
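For what it's worth, in a Pipeline job the two options would look roughly like this (the globs are only examples); archiving everything makes the whole workspace downloadable as a zip from the build page, at the cost of time and disk space:
// Inside a steps { } block of a Pipeline job.
// Preferred: archive only the intended build output.
archiveArtifacts artifacts: 'target/*.jar', fingerprint: true

// Or, if you really need the whole workspace, archive everything (slow and large).
archiveArtifacts artifacts: '**/*', allowEmptyArchive: true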

Copy generated folder from one job to another in Hudson/Jenkins

I have two jobs in my Hudson configuration. Let's call them A and B.
Job A was created specifically to generate a folder application_home. This folder is a ready-to-be-used-in-installations-application-home-folder.
Job B is the "pack-all-together-for-installation-job". It needs to copy the application_home generated by job A to generate the installer. My problem is that after some investigation, I was not able to do this in a simple way.
I could use a shell script, but then I would need to know job A's path plus where its workspace is in order to get the application_home folder.
Is there a simpler way to do this?
EDIT
I know about the Copy Artifact Plugin. The problem is that it only copies artifacts. I need to copy the folder application_home as it is, because it's already in the structure to be used in the installer. If there's a way to use this plugin to copy only the folder, I haven't found it.
EDIT 2. Answer:
Ok, you can do it using Copy Artifact Plugin. You need to
Set its configuration to "copy from WORKSPACE of latest completed build".
Set the "Artifacts to copy" option to the folder, like this: target/application_home/**
Set the "Target directory" to where you want it, something like: installation_bundle_folder/application_home.
and it's done :)
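If you are on Pipeline rather than a freestyle job, the same Copy Artifact configuration translates roughly into the snippet below; the selector shown here is lastSuccessful(), so adjust it if you really need the "workspace of latest completed build" behaviour described above:
// Sketch of the Copy Artifact plugin step in job B (job name and target directory as used above).
copyArtifacts(
    projectName: 'A',                          // the job that generates application_home
    filter: 'target/application_home/**',
    target: 'installation_bundle_folder',
    selector: lastSuccessful()                 // assumption: replace with the selector matching your freestyle config
)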
You could try the Copy Artifact Plugin.
Then you could add a build step to "pack-all-together-for-installation-job" that would copy application_home to the packaging directory. There is an option to only include the latest stable build of Project A.
Another alternative is to have a post-build step for a successful Project A build that scripts the copy of application_home over to where Project B will use it. You can use the WORKSPACE environment variable to get the absolute location. (See here for a list of environment variables.)
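A rough sketch of that scripted alternative, with a placeholder drop location, could be a final step in Project A along these lines (shown as a Pipeline sh step; in a freestyle job the same command goes into an "Execute shell" build step):
// Hypothetical last step of Project A: copy the generated folder to a shared drop location for Project B.
// /shared/drop is a placeholder; WORKSPACE is provided by Jenkins.
sh 'cp -r "$WORKSPACE/target/application_home" /shared/drop/application_home'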

Archiving artifacts not in the workspace when build fails

When an ANT build step fails in my build I'd like to archive the logs in order to determine the problem. The relevant logs, however, are not located in the workspace, so I have to use a full path to them.
The standard artifact archiving feature does not work well with full paths, so first I have to copy the logs into the workspace within some build step so that I can later archive them. I do not want to incorporate the copying code into the original ANT script (it does not really belong there). On the other hand, since the build step fails the build I can't execute the code that copies the artifacts into the workspace as a separate build step as it is never reached.
I am considering using the ANT -keep-going option, but how will I then fail the build?
Any other ideas (artifact plugins that handle full paths gracefully, for example)?
Update: I've worked around the problem by creating a symbolic link in the workspace to the directory that contains the files to be archived. Kludgy, but effective.
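For reference, that symlink workaround boils down to something like the following (the log directory is a placeholder; shown as a Pipeline sh step, but the same command works in an "Execute shell" build step):
// Hypothetical step: expose the external log directory inside the workspace via a symlink.
// -s creates a symlink, -f replaces an existing link, -n does not follow an existing link to a directory.
sh 'ln -sfn /var/log/my_build_logs "$WORKSPACE/external-logs"'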
I would recommend using the Flexible Publish plugin in conjunction with the Conditional BuildStep plugin.
The Flexible Publish plugin allows you to schedule build steps to run AFTER the normal build steps have run. This allows you to catch both successful and failed builds and execute something - say, a script that copies the files from OUTSIDE the workspace to INSIDE the workspace. The Conditional BuildStep plugin allows making those steps conditional so that they only run when the build fails. Using these two plugins, you can copy the files into the workspace upon failure, then archive them with the usual Jenkins mechanisms.
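If this were a Pipeline job rather than a freestyle job, the same idea (copy from outside the workspace, then archive, only when the build fails) is a few lines; this is only a sketch with placeholder paths, not the Flexible Publish configuration itself:
// Pipeline sketch: on failure, pull the external logs into the workspace and archive them.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'ant build'   // the ANT step that may fail
            }
        }
    }
    post {
        failure {
            // /var/log/mybuild is a placeholder for the log directory outside the workspace.
            sh 'cp -r /var/log/mybuild "$WORKSPACE/failure-logs"'
            archiveArtifacts artifacts: 'failure-logs/**', allowEmptyArchive: true
        }
    }
}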

Jenkins: How To Build multiple projects from a TFS repository?

I have set my workspace directory to C:\jenkins_builds\workspace and I want to build ProjA and ProjB, each having a local workfolder (same as project name).
When fetching the source code from my repository, the first two things the TFS plugin does are:
tf workspace -new %workspace-name-A%;%user-name% -server:%my-server%
tf workfold -map $%branch% ProjA -workspace:%workspace-name-A% -server:%my-server%
Which goes fine when building ProjA. The problem is, the first command maps the root directory from the repository directly to my C:\jenkins_builds\workspace directory. The second command does what I actually want, i.e. mapping %branch% to the ProjA subfolder. Later on, when building ProjB, the first command fails (and consequently the build) with the following error message:
The path C:\jenkins_builds\workspace is already mapped in workspace %workspace-name-A%;%user-name%.
OK, it seems like a bad idea to map the root directory to the work directory. But why does this automatically happen when the TFS plugin runs the "workspace -new" command? Currently I have to clean things up between building ProjA and ProjB by running the -unmap command.
My team is using Team Foundation 3.0.
We have the same situation and there are 2 ways to solve this:
Use different workspace root directories for the two builds.
This results in the need for two checkouts => double the space and slower builds, but better isolation between the two builds.
"Hardcode" the workspace name to the same value for both builds.
By default Jenkins creates a workspace whose name contains the build name; this can be changed in the "Advanced" section of the TFS configuration. You can then use the same workspace/workfolder mapping for several builds - in our case we called them ProjectName_${NODE_NAME}, so it even works on several nodes.