Is there an easy way to download the Travis CI build logs for all builds of a specific branch?
As mentioned here, it is possible to get a specific log file. This can be used in a bash loop over job ids,
for JOBID in $(COMMAND-HERE)
do
  wget "https://api.travis-ci.org/v3/job/$JOBID/log.txt"
done
but I'm not sure how to get the job numbers associated with, for example, the develop branch of the repository.
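Something along these lines is what I'm picturing for that missing command, if the v3 builds endpoint can be filtered by branch (OWNER%2FREPO is a placeholder, this assumes jq is available, and a private repo would also need an Authorization token header):

REPO_SLUG="OWNER%2FREPO"   # placeholder: repo slug with "/" URL-encoded as "%2F"
for JOBID in $(curl -s "https://api.travis-ci.org/v3/repo/$REPO_SLUG/builds?branch.name=develop&limit=100" \
               | jq -r '.builds[].jobs[].id')
do
  wget "https://api.travis-ci.org/v3/job/$JOBID/log.txt"
done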
Related
Gitlab-ci's default mode is to use git clone in every job in a pipeline.
This is time-consuming, especially since after cloning we need to install/update all dependencies.
I'd like to flip the order of our pipelines, and start with git clone + docker build, then run all subsequent jobs using that image without cloning and rebuilding for each job.
Am I missing something?
Is there any reason that I'd want to clone the repo separately for each job if I already have an image with the current code?
You are not missing anything. If you know what you are doing, you don't need to clone your repo for each stage in your pipeline. If you set the GIT_STRATEGY variable to none, your test jobs, or whatever they are, will run faster and you can simply run your docker pull commands and the tests that you need. Just make sure that you use the correct docker images, even if you start many parallel jobs. You could for example use CI_COMMIT_REF_NAME as part of the name of the docker image.
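A minimal .gitlab-ci.yml sketch of that idea (the registry path, image name, and test script are made up, and registry authentication is omitted; GIT_STRATEGY and CI_COMMIT_REF_NAME are the real variables):

build:
  stage: build
  script:
    - docker build -t "registry.example.com/myapp:$CI_COMMIT_REF_NAME" .
    - docker push "registry.example.com/myapp:$CI_COMMIT_REF_NAME"

test:
  stage: test
  variables:
    GIT_STRATEGY: none          # skip the clone entirely for this job
  script:
    - docker pull "registry.example.com/myapp:$CI_COMMIT_REF_NAME"
    - docker run --rm "registry.example.com/myapp:$CI_COMMIT_REF_NAME" ./run-tests.sh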
As to why GitLab defaults to using git clone, my guess is that this is the least surprising behavior. If you imagine someone new to GitLab and new to CI, it will be much easier for them to get up and running if each job simply clones the whole repo. You also have to remember that not everyone builds docker images in their jobs. I would guess that the most common setups are either programming languages that don't need to be compiled, for example python, or a build job that produces binaries and then a test job that runs the binaries. They can then use artifacts to send the binaries from the build job to the test job.
This is easy and it works. When people then realize that a lot of the time of their test jobs is spent just cloning the repository, they might look into how to change the GIT_STRATEGY, and to do other things to optimize their particular build.
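A minimal sketch of that artifacts pattern (the job names, build command, and paths are made up):

build:
  stage: build
  script:
    - make
  artifacts:
    paths:
      - bin/

test:
  stage: test
  script:
    - ./bin/run-tests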
One of the reasons for using CI is to run against your repo in a fresh state. That is not possible if you skip the git clone process in certain jobs. A job may modify the repo's state by deleting files or generating new ones; only the artifacts that are explicitly declared in the pipeline should be shared between jobs, nothing else.
Using Groovy and Jenkins Pipeline, is there any method of listing all git repositories and commits checked out using checkout during the course of a build aside from manually parsing the build log? I'm aware that changeSets allows one to see what changes have been made between runs, and that by bookkeeping all of these commits, it is possible to piece together the last known set of commits that were successful in building, but deleting/losing any of these builds would result in an incomplete log and prevent reconstruction. I'd like to know if there's an easier way of obtaining a git configuration for a given build.
I have an svn repo and a certain Jenkins job for the stuff therein. Using Jenkins svn plugin's "include regions" feature, I can configure Jenkins to poll changes in certain folders or filetypes. But that is for triggering the job. When the actual job starts to execute, how do I know what were the files whose change triggered the build?
I can easily grep the answer out of svn log in a shell script if there is only one commit that triggers the build. But if there is an unknown number of commits causing my Jenkins job to start, I'm in trouble.
I'm asking this because I want my Jenkins job to run certain analysis ONLY for those files whose change triggered the build.
Multiple commits pushed at once can also trigger the job, so you are probably already in trouble. One option is to maintain a file in the job's workspace and, at the end of every build, save the revision that was built. In your script, diff from that saved revision to the current HEAD and check whether any of the files you care about changed, as per your constraints, and only run your analysis if all the conditions are met. Hope this helps.
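A rough shell sketch of that bookkeeping (run_my_analysis and the file pattern are placeholders; svn info --show-item needs a reasonably recent Subversion client):

LAST_REV_FILE="$WORKSPACE/last_built_revision"   # bookkeeping file kept in the job's workspace
CURRENT_REV=$(svn info --show-item revision)

if [ -f "$LAST_REV_FILE" ]; then
  PREV_REV=$(cat "$LAST_REV_FILE")
  # list every file changed between the previously built revision and the one just checked out
  CHANGED_FILES=$(svn diff --summarize -r "$PREV_REV:$CURRENT_REV" | awk '{print $2}')
  if echo "$CHANGED_FILES" | grep -qE '\.(c|h)$'; then   # adjust the pattern to the files you care about
    run_my_analysis $CHANGED_FILES                       # placeholder for your own analysis step
  fi
fi

echo "$CURRENT_REV" > "$LAST_REV_FILE"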
I assume this may be fixed, but just in case, there is also the "Last Changes plugin" from Jenkins.
https://github.com/jenkinsci/last-changes-plugin
It makes a diff between what was already in that environment and what is about to be pushed, and gives you the result.
We are using a Jenkins + GerritTrigger setup for CI, and it starts a build for each commit even though all the commits came from a single push. Since all the changes depend on each other, it would be enough to run a single build with all the changes, but I don't see that option in the GerritTrigger plugin.
I believe many companies use Jenkins and Gerrit combination and I am curious to know how they are handling these cases.
Example:
If a developer pushes the 4 commits below to gerrit at once, it will create 4 corresponding changes in gerrit, say 1, 2, 3, 4, and start 4 builds in jenkins, one per commit:
git log --oneline
e3dfdwd CommitD
5fgfdgh CommitC
df34dsf CommitB
a23sdr3 CommitA
Here the 4 commits as a whole will pass all the tests in jenkins, but individually they will fail. The jenkins builds will fail for A, B, and C, and succeed for D, since D's build checks out A, B, and C as its dependencies.
In this case, even though commit D is successful, it can't be merged because its dependencies did not pass in Jenkins.
From a development point of view it seems reasonable to expect jenkins verification per push instead of per commit, but GerritTrigger can only run per commit.
Question:
Is there a way to tell jenkins to start a build only for commit D, since it will contain all of its dependencies (A, B, C)?
Or can we start a build for each git push from development, instead of for each commit?
Sorry if I missed any info.
I found a way to start a build only for commit D.
I introduced a gerrittrigger job that runs immediately after every commit; this job does not do any clone/build/verification.
It only performs a set of checks, for example: does the given change have a needed-by change, do its dependencies exist, and are the dependencies on the same branch.
This job then triggers another, main job that does the real clone, checkout of the change, build, verification, etc., but only for changes that pass all the validations.
So the main job is always started for the top commit, and it approves/rejects all the dependent changes based on the job result.
Though it has a few limitations, we found this method suitable for our workflow.
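To illustrate the kind of check the validation job runs, something along these lines (the host, user, and jq usage are assumptions for the sketch; GERRIT_CHANGE_NUMBER is set by the GerritTrigger plugin):

CHANGE_NUMBER="$GERRIT_CHANGE_NUMBER"   # provided by the GerritTrigger plugin
NEEDED_BY=$(ssh -p 29418 jenkins@gerrit.example.com \
      gerrit query --format=JSON --dependencies "change:$CHANGE_NUMBER" \
      | jq -r 'select(.neededBy != null) | .neededBy[].number')

if [ -n "$NEEDED_BY" ]; then
  # not the top of the push; let the change(s) that need this one trigger the real build
  echo "Change $CHANGE_NUMBER is needed by $NEEDED_BY, skipping the main build trigger"
  exit 0
fi
# otherwise kick off the main job (e.g. via a parameterized trigger or the Jenkins remote API)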
Most companies don't use git and gerrit :) Most companies don't even use git, unfortunately. And most of the ones that do don't use gerrit. I've consulted for dozens of companies: two use git, and neither of them has even heard of gerrit.
I don't think it's possible to get gerrit to think of pushes as if they were commits. Since each commit in a push can be separately reviewed and rejected, each commit has to be considered and built separately. If you don't want it to work that way, gerrit might not be for you.
Instead, you should squash your related commits locally before pushing them to gerrit. This will achieve the desired results.
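For example, to squash the four commits from the earlier example into one before pushing for review (assuming the target branch is master):

git rebase -i HEAD~4                    # mark CommitB, CommitC and CommitD as "squash" (or "fixup")
git push origin HEAD:refs/for/master    # push the single squashed commit to gerrit for review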
I am looking to add some automated tests to run nightly on a project. Currently the project has a few jobs that create multiple builds of various components of the project.
The builds create rpm files; there are multiple jobs creating multiple rpms. I want to grab all of the rpms, install them, and test them all in a single test job, since they have lots of dependencies on each other. I can install them via the command line, but the rpms are stored on the Jenkins master machine.
This is as far as I have got:
I have set up the job in Jenkins
I have created a slave for the job to run on
I have used Jenkins to run a bash script on the slave (works)
What I want to do is the following:
At regular intervals (let's say once per day, when I know the builds have all completed) fetch the most recent passed builds of all the projects and copy them to the slave machine
Install the rpms using a script.
The script performs certain tests during the install (looking at logs etc...), so I want to collect all of these results and send them back to Jenkins (I may eventually perform other tests here too)
I want the status of the last build image to be determined by my own tests
I also want the test results, logs, etc... to be stored in the Jenkins test job so that we can view them and marvel at their awesomeness!
What I don't know how to do is:
How to copy the files to the slave? Should this be handled on the slave itself using wget or something, or does Jenkins have the functionality (plugin maybe) that handles this all for me?
How do I report my custom results back to the Jenkins job?
I only started working with Jenkins three days ago (the project and Jenkins build jobs are a lot older than that), apologies if I'm missing anything obvious.
UPDATE
I'm thinking a combination of these plugins might do the trick, though I haven't looked into them too much yet.
Copy artifact plugin to copy the rpms from the latest stable builds of the other jobs
xUnit plugin to interpret some xml files that I generate during the test process
I didn't actually need any plugins for this. I simply set up the job to run on the slave, added a build step that ran some tests and generated an XML file (similar to the JUnit XML results), and then added a post-build step to look at the JUnit results (even though the tests weren't JUnit tests).
This worked a charm, and I have builds being marked as unstable if they fail tests that I specify, such as whether an rpm installed and whether such and such happened.
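For reference, the build step does something roughly like this (the package and test names are made up; the point is just that the output follows the JUnit XML shape the post-build step understands):

RESULTS="$WORKSPACE/results.xml"
if rpm -ivh mypackage.rpm; then   # mypackage.rpm is a placeholder for one of the copied rpms
  CASE='<testcase classname="install" name="install_mypackage"/>'
  FAILURES=0
else
  CASE='<testcase classname="install" name="install_mypackage"><failure message="rpm install failed"/></testcase>'
  FAILURES=1
fi
cat > "$RESULTS" <<EOF
<testsuite name="rpm-install-tests" tests="1" failures="$FAILURES">
  $CASE
</testsuite>
EOF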
I was able to get the latest stable builds since the latest stable builds are copied to a file server anyway; failed builds don't go there, so that was simple.