What does Jenkins look at in the project to start unit tests?

I have looked into Jenkins tutorials and almost all of them mention that we should provide the URL to the Git repo.
Fine.
But once Jenkins has access to the Git repo, which part of the project does it look at to figure out which tests should be run, or whether to run them at all? Is it some configuration file in the repo?

I guess that depends on what kind of project your repo contains, if I understand the question correctly. The provided URL gives Jenkins the information to do a git clone of that URL, which checks out the project into the Jenkins workspace.
Then, depending on the project type, let's say it's a Maven project, you fill in the goals you'd like Jenkins to run locally, usually clean test. They are then run at the top level, the root of the project, on the assumption that a pom.xml will be found there. If not, you'll have to tell Jenkins where to look.
A clearer answer would perhaps be easier to give if you said what kind of project you'd like to build.
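To make this concrete, the same flow expressed as a minimal declarative Jenkinsfile might look roughly like this (the repository URL and branch are placeholders, and it assumes a pom.xml at the repository root):

    pipeline {
        agent any
        stages {
            stage('Checkout') {
                steps {
                    // Jenkins clones the repository into the job workspace
                    git url: 'https://example.com/your/repo.git', branch: 'main'
                }
            }
            stage('Test') {
                steps {
                    // run the Maven goals from the project root, where the pom.xml lives
                    sh 'mvn clean test'
                }
            }
        }
    }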

Related

GitHub Action: Pass actions/setup-java@v3's maven repo to own action

I implemented an OWASP dependency checker like https://github.com/dependency-check/Dependency-Check_Action, but I do not use the CLI, I use the Maven plugin. Every night a Docker image is built via GitHub Actions to get the latest vulnerabilities database, which is stored in the .m2 directory.
So now I want to use this action in a GitHub workflow. Everything works fine apart from one use case: building and installing a Maven project into the local .m2 repository and enabling my action to access these artifacts.
Extract from the workflow file:
    - uses: actions/setup-java@v3
      with:
        distribution: temurin
        java-version: '17'
        cache: maven  # <- local maven repo
    - run: mvn install ...
    - uses: mycoolaction@main  # <- creates another internal maven repo in the docker image; it wants the artifacts from the previous mvn install
Can I pass the maven repo from the runner to my action? Or copy the artifacts?
Things I have tried:
The Docker image is built at startup, so the artifacts do not exist at that point. I cannot copy the files from the action - and maybe there are some security gates here?
Actions like https://github.com/addnab/docker-run-action, mounting the .m2 as a volume. That did not work out, but is probably worth investigating further.
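For reference, a rough sketch of the step I had in mind with addnab/docker-run-action, mounting the runner's .m2 into the container (the image name, paths, and Maven goal are just placeholders):

    - uses: addnab/docker-run-action@v3
      with:
        image: my-dependency-check-image:latest   # placeholder for the nightly image with the vulnerability database
        options: -v /home/runner/.m2:/root/.m2    # mount the runner's local maven repo into the container
        run: mvn org.owasp:dependency-check-maven:check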
Do you think it is possible to merge the two .m2 repos (runner and action)? Am I missing something?
P.S. I want to maintain the OWASP check via global workflows rather than maintaining every single pom in the organization.
While writing this, I am thinking of a solution via a parent pom. But on a new version I would also have to update each pom to the new version...
I would be very grateful if someone could share their experience and opinion on how to achieve the most maintainable solution in this case.
Thanks in advance!

Jenkins Multibranch Pipeline can't find Jenkinsfile in subdirectory using svn

I'm trying to set up a build using Multibranch. I'm basically having the same problem as stated here, but our SCM is Subversion. The bug in the Bitbucket Branch Source Plugin as described here can therefore be ruled out, especially since our Jenkins has the newest version installed anyway.
I tried to find a similar ticket regarding my problem, but couldn't find one, so here I am.
As this particular project is configured so that configuration files (including things like the Jenkinsfile) are stored in a subfolder, I don't know what else to try, apart from configuring individual jobs. I'd rather stick with Multibranch Pipelines, however, as they help keep the build jobs tidy.
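For illustration only, this is roughly the intended setup expressed as Job DSL (the job name and script path are placeholders, and the Subversion branch source is left out), with the Script Path of the Build Configuration pointing at the subfolder:

    multibranchPipelineJob('my-svn-project') {
        branchSources {
            // configure the Subversion branch source for the repository here
        }
        factory {
            workflowBranchProjectFactory {
                // look for the Jenkinsfile in a subfolder instead of the repository root
                scriptPath('config/ci/Jenkinsfile')
            }
        }
    }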

Does Jenkins support incremental pipeline builds?

I have been searching far and wide to see if I can find information on Jenkins incremental pipeline builds that does not involve Maven.
The general idea is that I want to build a generic project and run specific steps of the pipeline if the underlying code has changed. If the code did not change, I want to re-use the results from a previous build.
The reason why I want to do this, is to drastically reduce build times for huge projects.
Imagine that you only need to fix one line in an SCSS file, but the whole project needs to be rebuilt, repackaged, etc. because of it. In the meantime, the site is live and broken, waiting 15 minutes for the fix.
Can someone give a basic example of how such a build can be created or where I can find more information on incremental building?
The only thing I have been able to find is incremental building for Maven projects, but this is not applicable for me.
The standard solution is to create modules that depend on each other.
Publish the built artifacts of your modules to a binary repository like Sonatype Nexus (you can easily create a private npm repo as well as a proxy npm repo).
During the build, download the dependencies instead of building them.
If this solution is not the one you want to take, you will have a hard time hacking together a solution. To persist the state of your steps, an easy approach is to create files in the job workspace and read them on the next build.
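As an illustration of that idea, a minimal declarative pipeline sketch; the registry URLs, paths, and commands are placeholders, and the changeset condition only runs a stage when matching files changed in the build's changesets:

    pipeline {
        agent any
        stages {
            stage('Build frontend module') {
                when {
                    // skip this stage when nothing under frontend/ changed since the last build
                    changeset "frontend/**"
                }
                steps {
                    sh 'npm ci && npm run build'
                    // publish the artifact to the binary repository so later builds can download it
                    sh 'npm publish --registry https://nexus.example.com/repository/npm-private/'
                }
            }
            stage('Assemble site') {
                steps {
                    // download dependencies (including modules that were not rebuilt) instead of building them
                    sh 'npm ci --registry https://nexus.example.com/repository/npm-group/'
                }
            }
        }
    }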

Phabricator + Jenkins: Any ideas of how to get the whole workspace when executing an arc diff?

I'm using the Jenkins plugin for the integration with Phabricator. Everything works perfectly during the integration, but when an arc diff is executed, the only files received by the Jenkins job are the files edited or created by the arc diff. This is a problem when creating microservices and editing files that could potentially affect other microservices.
Any thoughts on how to get the whole workspace from the repository (also hosted in Phabricator) to be tested, instead of only the diff files?
I've created an issue on the plugin, here is the link:
https://github.com/uber/phabricator-jenkins-plugin/issues/334
Using the Git plugin and the repository URL, this has been quite straightforward: I added the URL of the staging area (in my case the same as the repository) and Git credentials (as username/password). The only trick is to indicate which tag to test. Since Phabricator staging areas use two tags per diff (phabricator/base/${DIFF_ID} with the base code of the diff and phabricator/diff/${DIFF_ID} with the whole code with the diff applied), I used phabricator/diff/${DIFF_ID} as the branch specifier.
As a result, the Git plugin will check out and build the code of the whole project with the diff applied.
More information about the integration and the needed variables can be found here:
https://github.com/uber/phabricator-jenkins-plugin
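In pipeline form, an equivalent checkout might look roughly like this, assuming the Phabricator plugin exposes DIFF_ID to the build and using placeholder URL and credentials ID:

    checkout([
        $class: 'GitSCM',
        // build the tag that contains the whole project with the diff applied
        branches: [[name: "refs/tags/phabricator/diff/${env.DIFF_ID}"]],
        userRemoteConfigs: [[
            url: 'https://phabricator.example.com/source/myrepo.git',  // staging area URL (same as the repository here)
            credentialsId: 'phabricator-git'                           // username/password credentials
        ]]
    ])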

How to use one .travis.yml file for multiple repositories

I have several separate GitHub repositories in a GitHub organization for which I want to run the same build test with Travis CI.
That is, I want to be able to use the same .travis.yml for all of these repositories. Moreover, I'd like to be able to update this file and have those changes be valid for each repository.
I could copy the .travis.yml into each repository, but if I have a hundred or two hundred repositories, that gets annoying real fast.
Is there any way to simply point each repository to an external .travis.yml rather than having to put a duplicate .travis.yml file in each repository?
There isn't a way to do this with a remote .travis.yml file, as Travis CI will look at the root of the project for this file. An alternative approach I would suggest to accomplish your goal:
Build automation around updating all of your repositories' .travis.yml files from a shared common file. Using your favorite scripting language, update the file in all specified repositories and push the changes to GitHub/GitLab automatically. This should help with the maintenance of your repositories with just a bit of extra automated work.
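As a rough sketch of that automation in shell (the organization name and repository list are placeholders, and it assumes the shared file lives at shared/.travis.yml):

    #!/usr/bin/env bash
    set -euo pipefail

    ORG="my-org"
    REPOS=(repo-one repo-two repo-three)   # or generate this list via the GitHub API

    for repo in "${REPOS[@]}"; do
        git clone --depth 1 "git@github.com:${ORG}/${repo}.git"
        # overwrite each repository's .travis.yml with the shared one
        cp shared/.travis.yml "${repo}/.travis.yml"
        (
            cd "${repo}"
            git add .travis.yml
            git commit -m "Update shared .travis.yml" || true   # nothing to commit if the file is unchanged
            git push origin HEAD
        )
        rm -rf "${repo}"
    done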
