How to store Jenkins build output with artifacts?

I have a Jenkins (which I'm new to) system that has a few constraints imposed on it:
master/slave arrangement
slave is Windows 7
installing plugins is likely to experience significant resistance
builds are to be archived to a Windows share
The goal is to have the output of the build and the log of the build archived to:
//server/share/archive/%BUILD_TAG%/
I'm using the post-build step Send build artifacts to a windows share to push the binaries to the archiving server, but I've yet to find a way to get the build log there too. I have discovered that the console output is available at http://jenkins/job/$jobTitle/$buildNumber/consoleText, but I'm not sure how useful that is, since I can't run anything on the archive server itself. I'd like to maintain the separation whereby the master knows about both the slave and the archive, but the slave and the archive are unaware of each other (assuming the Send build artifacts... step runs on the master rather than being pushed down to the slave).
Another approach I explored was to have Jenkins store the artifacts locally (via Archive the artifacts), but then I don't see a way to promptly push the result and the log to the external server.
It seems that Jenkins is very flexible and that this scenario is likely already catered for, but I've yet to find out how.

Using the Post Build Task plugin, you can get the console output with a wget command:
wget -O console-output.log ${BUILD_URL}consoleText
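To get the log onto the archive share as well, one option is to extend that same step so the fetched file is copied next to the binaries. A minimal sketch, assuming a shell post-build step and that the share is already mounted at /mnt/archive (the mount point and the availability of wget are assumptions, not part of the original answer):
# Fetch this build's console log from the Jenkins master
wget -O console-output.log "${BUILD_URL}consoleText"
# Drop it into the same per-build archive directory as the binaries
mkdir -p "/mnt/archive/${BUILD_TAG}"
cp console-output.log "/mnt/archive/${BUILD_TAG}/"
On the Windows 7 slave, the equivalent would be a batch step using %BUILD_URL% and %BUILD_TAG% with copy against the UNC path.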

Related

Should I use a build user or the jenkins user for building projects? I.e., who should own the build artifacts in Jenkins?

What is the standard approach for building large projects in Jenkins (e.g. 80 GB of dependent jobs building libs and binaries from one repo)?
If you split this large build into jobs that build libs and others that build downstream apps, should you use the jenkins user to build the project, or set up a separate build user?
What if you farm out to a remote agent using SSH?
The jenkins user is not allowed to remote into another host by default, because of this /bin/false here:
$ grep jenkins /etc/passwd
jenkins:x:996:992:Jenkins Automation Server:/var/lib/jenkins:/bin/false
So does this mean all farmed-out jobs run as a build user, or should you change this setting to /bin/sh so the ssh session can complete as the jenkins user? Or should you only set up JNLP agents?
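For reference, if you did decide to let the jenkins user complete SSH sessions, the usual way to change its login shell is usermod; a hedged one-liner (whether this is wise is exactly what's being asked):
# Change the jenkins user's login shell from /bin/false
sudo usermod -s /bin/bash jenkins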
If you build the libs on the Jenkins master, should the .o files and binaries be owned by jenkins or not?
If you build on a dedicated Jenkins master, should the executors be agents set up to use a build user rather than the jenkins user?
If you use an NFS-type mount for sharing build artifacts, how does that look?
Related to the two previous questions: all local jobs would have build ownership under the jenkins user. So does that mean you use a build user, and attach a remote node to localhost as the build user, just so the artifacts all have the same owner for all local and remote agents to use?
These may sound like stupid questions, but I can't find any guidelines on who the build owner should be, or on best practice for making a very large single-Git-repo project build in a sane way (the repo owners do not want to split the code up into different repos because of static linking).
In our experience (very large monorepo, 250+ slaves):
We united several jobs into one big job with parallel stages where applicable, so independent things can be built at the same time on different slaves (to cut time). That way it is easier to follow what failed and why, you have all the artifacts in one place, and there's one Jenkinsfile to follow.
All our slaves are set up as JNLP agents, and when they reboot they start jenkins-agent. There's no jenkins user on our slaves.
Since you are supposed to collect all the artifacts and archive them at the end, preferably wiping the slave back to a clean state, it doesn't matter who owns them, and you can always change ownership with chown.
NFS would not be a great idea for this in our case, as it would be severely constrained by network and disk usage. We use a Docker registry for Docker images, but Artifactory might work if you're not using Docker; minio would be another option.

Jenkins - Copy Artifacts from upstream job built in different node

There is a job controlled by the Development team which is built on a different node. I am on the Testing team and want to take the artifacts and deploy them onto a test device.
I can see those artifacts from dev are stored in some path on dev's node. Does that mean they must first be archived on the Jenkins master before I can copy them to my job?
I am using the Copy Artifact plugin and constantly getting the error
Failed to copy artifacts from <dev-job> with filter: <path-in-dev-node>
Some newbie questions, since I just moved from TeamCity.
You probably want to use the Copy Artifact plugin.
Adds a build step to copy artifacts from another project.
Consider also the Jenkins post-build step "Archive the artifacts".
If you copy from the other job's workspace, what happens if another build is in progress or the workspace is wiped? The archive step copies the artifacts from the node to the master and stores a copy along with the build logs, etc. That makes them available via the UI for as long as the build logs remain. It can take up space, though.
If you do use archived artifacts, consider using the system property jenkins.model.Jenkins.buildsDir to store all the build logs (and artifacts) outside of the job's config directory. Some downtime and work are required to separate the two (config / logs).
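As a hedged illustration, that property is passed to the Jenkins controller as a Java system property; the target path here is an assumption, and ${ITEM_FULL_NAME} is expanded by Jenkins itself, not by the shell:
# e.g. appended to the Jenkins service's Java options
-Djenkins.model.Jenkins.buildsDir='${JENKINS_HOME}/builds/${ITEM_FULL_NAME}'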
You may also want to consider using a proper repository manager (Nexus / Artifactory).
Finally, you may want to learn about using a Jenkins pipeline rather than relying on chained jobs, triggers, users and so forth. Why? Because it's much more controlled and easier to maintain.
ps: I'm not a huge fan of artifactDeployer, but it may work for you.
pps: you may want to review this in-depth answer: Jenkins downstream job fails to find upstream artifacts

Deploy web app via Jenkins

I have recently started to mess about with Jenkins and am unsure how to deploy my web app to a basic server. I've gotten into the Pipeline (https://jenkins.io/doc/book/pipeline/) and it seems like a fantastic way to work.
Where I'm a bit stuck is in two spots:
Once my repo is in my workspace within Jenkins, how do I prep it so I am only deploying the files necessary for the application? For example, I don't need my src/ directory or my Vagrantfile when I'm deploying things.
How do I deploy my app to the server? I see examples all over the place, but I am getting a bit lost since there seems to be so many ways to do this. I'm assuming scp or something like that...?
To build off of #2, is there a way to deploy web apps as transactions (in one shot) rather than file-by-file?
Please let me know if I can provide any information for potential answers!
I can't speak to your specific use case but a common way to do this is the build-and-deploy model, where you will have 2 Jenkins jobs. The "build" job will check out from source, run build commands such as maven or make, and lastly will "archive" the build artifacts. The latter is an option under the 'post-build actions' tab at the bottom.
In the "deploy" job, you will grab the artifacts of your choice. You can fetch a single file, all of them, and everything in between. This requires use of the 'Copy Artifact' plug-in and it allows you to copy files generated by other jobs. Now you can run your usual deploy script in the 'Execute Command' box. Most command line paradigms are supported out of the box such as setting environment variables.
The instructions above assume that you want to run your application off of a host that you've provisioned as a Jenkins slave.
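To make the deploy step itself concrete, and to touch on the "one shot" question from the post: a common pattern (a sketch, not something the answer above prescribes) is to push each build into its own release directory and then flip a symlink, so the switchover is effectively atomic. The host, user, paths, and excludes are all illustrative assumptions:
# Ship only what production needs, leaving out dev-only files like src/ and the Vagrantfile
rsync -az --exclude 'src/' --exclude 'Vagrantfile' ./ deploy@example.com:"/var/www/releases/${BUILD_NUMBER}/"
# Point the live docroot at the new release in one atomic step
ssh deploy@example.com "ln -sfn /var/www/releases/${BUILD_NUMBER} /var/www/current"
Rolling back is then just a matter of repointing the symlink at a previous release directory.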
Use artifacts as mentioned by Paul Back, or a third-party Artifactory server as in the video.
This is always tricky and error-prone. Why not spin up a fresh server with each new release (humanly verified once)?
Jenkins & Ansible is the answer here. This is how I deploy to production, since I have no need for anything like Docker (too many issues with that particular app) and have to run the app natively. A quick example:
You monitor a specific branch in GitLab / GitHub or whatever else, and a webhook is called on push / merge etc. on that branch; at that point, the Jenkins job that monitors the branch runs an Ansible playbook that handles whatever you need to do.
In my case Jenkins and Ansible run on the same server: Jenkins runs the Ansible playbook, which does whatever I need to do.
For example, with Ansible I copy certain files that need to be there, run configs / change filenames, set up nginx, run composer, and so on.
You get the point.
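A hedged sketch of what the Jenkins shell step might boil down to (the playbook name, inventory path, and variable are illustrative assumptions):
# Run the deploy playbook from the Jenkins job
ansible-playbook -i inventory/production deploy.yml --extra-vars "release=${BUILD_NUMBER}"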

Delegate specific part of build to slave

I have a project where part of the build process is to create a native library on a remote machine. This is currently a manual process outside of the CI builds made by Jenkins.
The setup in question is that the Jenkins master server builds a Git-based Maven project, which has a dependency on a native library that can only be built on a specific machine. Jenkins can't compile this module, and because of this it is currently a manual process.
I would like to install a Jenkins slave on the machine that creates the native library, and returns the compiled files to the Jenkins master, without handling any other parts of the build.
I am having trouble figuring out if this is even possible. The articles I have found on the subject discuss Jenkins slaves as a means of distributing the build, but I want the slave to take responsibility for a small part of the build process and nothing else. The Jenkins master should just send the build request to the slave and wait for the result, instead of trying to compile the code itself.
I do exactly the same. My setup is very similar to what Mark O'Connor and gaige are advising, and I am using the Copy Artifact plugin.
job A: produces a zip file on a Mac
job B, runs on slave B - Windows machine, takes the zip as input and produces an MSI
Here's the important part in the config of job B:
restrict the job B on the proper slave using labels
make sure job B happens after job A
make sure artifacts from job A are sent to job B before your build
build your stuff
archive artifacts produced by job B
Delegating part of a job to a slave is something that would have to be done externally to Jenkins, for example using ssh.
However, as #kan indicates, you most likely want to extract the native library build as a separate job and then have that job execute on a particular slave, or any slave that meets a specific criteria.
To do this, my suggestion would be to use Labels in the node configurations to determine which slaves can be used for building that particular job.
In Jenkins > nodes > <slave node>, use the Labels property to set one-word labels that indicate your specific requirements, such as the OS or processor type.
Then, in the jobs that are node-specific, check Restrict where this project can be run and set the Label Expression to something that meets your criteria. If the criteria is simple, it will just be a single word, if you need a boolean, you can use those as well (such as OSX&&Lion in our case).
I believe this is all in the standard version of Jenkins, without need for a special plugin. Leave me a comment if it isn't and I'll try and diagnose which plugin enables this functionality.
This problem is solved by using a binary repository manager to centralize your software artifacts. Personally I use Nexus, but it could be something as dumb as a remote file system.
The idea is to publish the built artifact after each Jenkins job (if you don't like Nexus, you could use one of the Publish over plugins) and retrieve it as a build dependency in the next job.
This approach means it no longer matters where the build executes, and it has the added advantage of decoupling the build of each module component.
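As a hedged illustration of that publish/retrieve round trip against a raw Nexus repository (the server URL, repository name, and credential variables are assumptions for the sketch, not part of the answer):
# Job A (on the special machine): publish the native library
curl -u "$NEXUS_USER:$NEXUS_PASS" --upload-file libnative.so "https://nexus.example.com/repository/raw-builds/native/${BUILD_NUMBER}/libnative.so"
# Job B (anywhere): fetch it back as a build dependency
wget "https://nexus.example.com/repository/raw-builds/native/${BUILD_NUMBER}/libnative.so"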

Apple CI / Xcode Service and Jenkins

Is there a way/plugin to integrate the new Xcode service and/or the new Apple CI with Jenkins?
Why?
A main issue with having a Jenkins server + an OSX build slave connected via ssh is that Unit Tests do not work, as the iOS Simulator needs a graphical environment which is not present in this configuration.
I hope that it is possible to integrate the Xcode service (which supports Unit Testing) with Jenkins.
It could be that using the Apple CI will be enough for my needs, but this question aims at the integration of the Xcode service with Jenkins.
What I do already know
I have experience with the existing Xcode Jenkins plugin, but it seems not to support the brand-new Xcode service or the new Apple CI. I'm especially keen on unit testing via CI (which did not work properly over an ssh session the old way).
What I want to know
I'd like info on the following issues currently not working with Jenkins and an ssh connected build slave:
Unit Tests on a headless system
Acceptance tests with Frank or similar
Automatic Provisioning Profile updating (Apple CI does that)
And info on things that currently do work fine with Jenkins and an ssh connected build slave and still should work with an Xcode service integration:
Builds of different build configurations (Release, Debug, TestFlight) / schemes
Automatic Build number increment
(With Jenkins I can set the build number in my project to ${BUILD_NUMBER}, and Jenkins sets this environment variable according to its build number. When the Apple CI does the builds most probably it will set the build number instead.)
Handling/synchronization of Build number between Jenkins & Apple CI
Accessing build products of the Apple CI from different Jenkins Jobs
e.g. for a Job to upload to TestFlight
Backup of builds
Automatic builds on git push to a specific branch
E-Mail notifications
Some additional questions/hints
I'm not sure whether the Apple CI == Xcode service, or whether the Apple CI just uses the Xcode service. In the latter case the Xcode service would just be like an intelligent build slave, and Jenkins could perhaps use it to do builds and tests but manage build numbers and products by itself.
I'm aware that the Apple CI is a separate CI, and that integrating several CIs with each other is not the easiest or most useful way to go. I just fear that the Apple CI is not flexible enough for my needs (see above), and that the old way with Jenkins has some problems (see above).
I believe you are going to have to choose either Jenkins or Xcode Server, not both. I don't know much about Xcode Server, but I do know about Jenkins and Xcode 5.
Builds with different configurations:
In the xcode plugin, you can set the scheme to use.
Automatic Build Number Increment
I added a parameter to my Jenkins job called XCODEBUILDNUMBER. Whenever I start a build, I simply copy the build number out of my Xcode project (I increment it manually; mine looks like 080813A) and paste it into the XCODEBUILDNUMBER parameter. I use this to name my output files, etc. There are plugins for Jenkins that can automatically increment your build number, but they don't integrate or sync with Xcode.
Handling/synchronization of Build number between Jenkins & Apple CI
As I said before, I don't know of a way to sync the build numbers, but I just thought of a possible solution: you could use the command-line tool PlistBuddy to set the build number in your Info.plist as a build step in your Jenkins job.
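A hedged one-liner for that idea (the Info.plist path is an assumption; CFBundleVersion is the standard key for the build number):
# Stamp the Jenkins build number into the app's Info.plist as a shell build step
/usr/libexec/PlistBuddy -c "Set :CFBundleVersion ${BUILD_NUMBER}" MyApp/Info.plist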
Unit Tests
I have not successfully made unit tests work with Xcode 5, but I know that the Xcode plugin for Jenkins supports them. I believe the absence of the "Test After Build" key in the project settings may have something to do with it. If you make it work, I'd love to know. (I am also keen on making this work.)
Acceptance Tests
From what I can tell, Frank is a command line tool. You can easily integrate it into your Jenkins job, and I believe that it will fail the build if your tests don't pass.
Accessing build products of the Apple CI from different Jenkins Jobs
Not completely sure what you mean, but with Jenkins you can archive your build product (an .ipa) for later download and upload to a service like TestFlight. Again, I don't know much about Xcode Server (CI).
Backup of builds
As I said before, Jenkins can archive your build product. Also, I use the ${BUILD_NUMBER} variable in my build products directory, so I have a different directory for each build. This directory is also backed up by Time Machine, and important builds are copied to my web directory.
Automatic builds on git push to a specific branch
With the Jenkins Git plugin, you can make Jenkins poll your SCM at an interval you specify and trigger a build on a change.
E-Mail notifications
I am sure that there is a plugin for this (one that emails you when a build fails/succeeds; in fact, this may be built in).
In Closing
The Xcode CI is a full, independent CI that may be hard to integrate with Jenkins. Personally, I would recommend Jenkins simply due to its extensibility. Sorry I don't know much about Xcode Server.
I've got unit tests running in Jenkins with Xcode 5 on my OS X build slave. Instead of using the Xcode plugin, I run as an execute shell build step:
xcodebuild test -scheme <scheme> -configuration Coverage -sdk iphonesimulator7.0 -destination OS=7.0,name="iPhone Retina (4-inch)"
My Coverage configuration is exactly the same as my Debug config, except Generate Test Coverage Reports is set to YES and Instrument Program Flow is set to YES. This is done so test coverage files are created. Due to a bug in Xcode 5, I call __gcov_flush(); in the tearDown of all my tests. I pipe the output of this xcodebuild command into ocunit2junit to get test reports in Jenkins.
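Putting those pieces together, a hedged sketch of the full shell build step (the scheme name is a placeholder, and ocunit2junit is assumed to be installed, e.g. as a Ruby gem):
# Run the tests and convert OCUnit output into JUnit XML for Jenkins
xcodebuild test -scheme MyScheme -configuration Coverage -sdk iphonesimulator7.0 -destination 'OS=7.0,name=iPhone Retina (4-inch)' 2>&1 | ocunit2junit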
