I am using the Jenkins Matrix plugin to create three different kinds of builds (dev, release, manufacturing). Each type of build has its own Artifactory repository.
Is there a way to configure the Jenkins Artifactory plugin to deploy the artifacts to a different repository depending on a condition (i.e. the type of matrix build)?
Currently I only see the option to deploy to a single repository.
My project is a generic project that builds a tar.gz file using a Groovy script.
This seems to be a shortcoming of the plugin; see issue HAP-568:
Currently there is only the ability to deploy to a single Artifactory
repository from a Jenkins CI job. It would be nice to have the ability
to deploy separate artifacts to multiple repositories from the same
Jenkins CI job. It would be even better if these multiple repos don't
need to be on the same server, but that would only be a nice-to-have.
An example would be a Jenkins CI job that builds source that outputs
Debian, YUM, and PyPI packages. It needs to deploy each to a separate
Artifactory repository.
As a workaround, you can push to Artifactory with curl:
curl -u <user>:<password> -s -X PUT \
--data-binary @<file> http://<artifactory-server>/<directory>/<file>
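If you can move this to a Pipeline job, the same curl workaround can pick the repository per build type. A minimal sketch, assuming a BUILD_TYPE choice parameter, hypothetical repository names and server, and an 'artifactory-cred' username/password credential (none of these are from the plugin itself):

// Map each build type to its own repository (names and server are assumptions)
def repos = [dev          : 'libs-dev-local',
             release      : 'libs-release-local',
             manufacturing: 'libs-mfg-local']
node {
    def repo = repos[params.BUILD_TYPE]   // e.g. from a choice parameter
    withCredentials([usernamePassword(credentialsId: 'artifactory-cred',
                                      usernameVariable: 'AUSER',
                                      passwordVariable: 'APASS')]) {
        // Same curl PUT as above, but the target repository is chosen at run time
        sh "curl -u \"\$AUSER:\$APASS\" -s -X PUT --data-binary @build.tar.gz " +
           "http://artifactory.example.com/artifactory/${repo}/build.tar.gz"
    }
}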
I need to set up a build configuration in Jenkins so that whenever a build is triggered, it gets my latest scripts from GitLab, copies them to the target systems, and runs the script on the target.
I couldn't find any relevant info on integrating GitLab with Jenkins. Are there any specific plugins that I could use?
I am using Jenkins version 2.158
Step-by-step procedure for what you are looking for:
Add the location of the script's GitLab repository in the job's source code management section.
Run the script on the target machine.
When the job builds, you will get the code at the root (./) of the job's workspace. Copying and running the script on the target machine can be done by remote script execution. The following are the common cases for running a script on a remote machine:
Windows (Jenkins) to Windows - use psexec.exe
Windows (Jenkins) to Linux - use plink.exe, which is command-line PuTTY
Linux (Jenkins) to Linux - use scp and ssh
Linux (Jenkins) to Windows - use Ansible for Windows.
E.g.:
$ scp script.sh remote_username@10.10.0.2:/remote/directory
$ ssh -t remote_username@10.10.0.2 /remote/directory/script.sh
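If you are on a Pipeline job, the Linux-to-Linux case can be scripted in a Jenkinsfile. A minimal sketch, where the repository URL, the target host and path, and the 'target-ssh-key' credential ID are all assumptions:

node {
    // Get the latest scripts from GitLab (URL and branch are assumptions)
    git url: 'https://gitlab.example.com/group/scripts.git', branch: 'master'
    // Copy the script to the target machine and run it over SSH
    withCredentials([sshUserPrivateKey(credentialsId: 'target-ssh-key',
                                       keyFileVariable: 'KEY',
                                       usernameVariable: 'RUSER')]) {
        sh '''
            scp -i "$KEY" script.sh "$RUSER@10.10.0.2:/remote/directory/"
            ssh -i "$KEY" "$RUSER@10.10.0.2" /remote/directory/script.sh
        '''
    }
}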
All the best.
Integration between a Git repository manager (GitHub, GitLab, Bitbucket, etc.) and Jenkins involves the following steps:
A developer pushes some source code (Java, PHP, Node.js, etc.) to the Git repository manager.
The Git repository manager detects this event and notifies some public HTTP endpoint in your Jenkins. Currently a webhook is the most recommended way to implement this notification.
Jenkins receives the HTTP POST request (from Bitbucket, for example) and, using some plugin or configuration, tries to determine or extract the basic devops parameters like branch name, commit author, commit message, technology, etc.
With the extracted devops parameters, Jenkins launches a preconfigured job. This job uses the extracted values to build, compile, zip, install, or do whatever is necessary to start up your application.
If you want to implement this flow, check this post:
https://jrichardsz.github.io/devops/devops-with-git-and-jenkins-using-webhooks
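As one concrete illustration of this flow, the Generic Webhook Trigger plugin (one option among several) can receive the POST and extract values from the JSON payload. A minimal sketch; the token and the JSONPath expressions assume a GitLab push event:

// Sketch: GitLab would POST its push event to
// http://<jenkins>/generic-webhook-trigger/invoke?token=my-token
pipeline {
    agent any
    triggers {
        GenericTrigger(
            token: 'my-token',
            // Extract devops parameters from the webhook's JSON payload
            genericVariables: [
                [key: 'BRANCH', value: '$.ref'],
                [key: 'AUTHOR', value: '$.user_name']
            ],
            causeString: 'Push by $AUTHOR on $BRANCH'
        )
    }
    stages {
        stage('Build') {
            steps {
                // Build, compile, zip, install, etc. using the extracted values
                echo "Building ${env.BRANCH} pushed by ${env.AUTHOR}"
            }
        }
    }
}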
Also, if you need it, I will gladly show you a basic integration using some Git repository manager and Jenkins. Just contact me.
I am trying to implement a CI/CD pipeline for my microservice-oriented project using Kubernetes and Jenkins. My code repository is on an on-premise server, where I created an SVN repository.
I am interested to know: can I use my private SVN code repository with Jenkins?
The reason for my doubt is that every example shows the creation of a pipeline with Jenkins and a GitHub project.
You can use shell commands in your pipeline, so you are free to use SVN with Jenkins:
https://tortoisesvn.net/docs/nightly/TortoiseSVN_en/tsvn-cli-main.html
Some info here:
Run bash command on jenkins pipeline
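For example, a minimal sketch of an SVN checkout from a shell step in a pipeline; the repository URL and the 'svn-cred' credential ID are assumptions (the Subversion plugin's native checkout step is an alternative):

node {
    withCredentials([usernamePassword(credentialsId: 'svn-cred',
                                      usernameVariable: 'SVN_USER',
                                      passwordVariable: 'SVN_PASS')]) {
        // Plain svn command line, driven from the pipeline's shell step
        sh 'svn checkout --username "$SVN_USER" --password "$SVN_PASS" ' +
           '--non-interactive https://svn.example.com/repo/trunk .'
    }
}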
I have a project in GitLab. The project gets built for every check-in to the repo, and build artifacts are created when the GitLab pipeline succeeds.
I want to get these build artifacts in my Jenkins pipeline job. Is there any way to do that?
I couldn't find any plugin in Jenkins to do this.
Any help is appreciated.
The GitLab API offers this for both the complete artifacts package (zip) and single files. You need a GitLab token, which you can add as a credential (secret text), and the project ID of the pipeline you want to copy from; see the GanttLab Live project for an example of a project ID.
curl --header "PRIVATE-TOKEN: <your_access_token>" "https://gitlab.example.com/api/v4/projects/1/jobs/artifacts/master/download?job=test"
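From a Jenkins pipeline, the same call can use the secret-text credential. A minimal sketch, where 'gitlab-token', the project ID (1), the branch, and the job name ('test') are assumptions:

node {
    withCredentials([string(credentialsId: 'gitlab-token', variable: 'GITLAB_TOKEN')]) {
        // Download the complete artifacts package (zip) for the master branch
        sh '''
            curl --header "PRIVATE-TOKEN: $GITLAB_TOKEN" --output artifacts.zip \
                "https://gitlab.example.com/api/v4/projects/1/jobs/artifacts/master/download?job=test"
            unzip -o artifacts.zip
        '''
    }
}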
We are building a Java-based high-availability service for a financial application. I am part of the team managing continuous integration using Jenkins.
Lately we added continuous deployment to the list as well, and we opted for Docker containers.
Here is the infrastructure:
The production cluster will have 3 RHEL machines, each running the following Docker containers:
3 instances of Wildfly
Cassandra
Nginx
The application IDE is NetBeans and the source code is in Git.
Currently we are doing manual deployment on this infrastructure.
Please suggest some tools that I can use with Jenkins to complete the continuous deployment process.
You might want Jenkins to trigger on each push to your Git repository. There are plugins that help you do that with a webhook. The Gitlab plugin is one solution; similar solutions exist for GitHub and other Git hosts.
Instead of heavily relying on bash and Jenkins configuration, you might want to set up a Jenkins pipeline with the Jenkins Pipeline plugin or even the Pipeline: Multibranch plugin. With those you can automate your build in Groovy code (a Jenkinsfile) in a repository, with the possibility to add functionality with other plugins building on them.
You can then use the Docker Pipeline plugin to easily build Docker containers, push Docker images, and run code inside Docker containers.
I would suggest building your services inside Docker so that your Jenkins machine does not need all the different dependencies installed (and therefore possibly conflicting versions). Use Docker containers with all the dependencies and run your build code in there with the Docker Pipeline plugin from Groovy, as in the sketch below.
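A minimal sketch of that pattern with the Docker Pipeline plugin; the image names, the registry URL, and the 'registry-cred' credential are assumptions:

node {
    checkout scm
    // Run the build inside a container with the toolchain, so the Jenkins
    // machine itself needs no JDK or Maven installed
    docker.image('maven:3-jdk-8').inside {
        sh 'mvn -B clean package'
    }
    // Build the service image and push it to a private registry
    docker.withRegistry('https://registry.example.com', 'registry-cred') {
        def img = docker.build("myteam/my-service:${env.BUILD_NUMBER}")
        img.push()
    }
}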
Install a registry solution to push your Docker images to and pull them from.
Use Pipeline: Shared Groovy Libraries to extract common code from your Jenkinsfiles so that it can be reused (see the sketch below). Those library files should have their own repository, which your Jenkins knows about and keeps up to date. Possibly you can even have an entire pipeline process shared between multiple projects, which simply add parameters in their Jenkinsfile.
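A small sketch of such a shared library, assuming a library named 'my-shared-lib' configured in Jenkins and a hypothetical deployTar step:

// vars/deployTar.groovy in the shared-library repository
def call(String repo, String file) {
    // AUSER/APASS and the server are assumed to come from the caller's environment
    sh "curl -u \"\$AUSER:\$APASS\" -s -X PUT " +
       "--data-binary @${file} http://artifactory.example.com/artifactory/${repo}/${file}"
}

// Jenkinsfile in a project: load the library and reuse the step
@Library('my-shared-lib') _
node {
    deployTar('libs-release-local', 'build.tar.gz')
}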
That is a lot of text; if something sounds interesting and you want to see more code, just ask. I am currently setting all this up.
So I have a few separate jobs in Jenkins. The first one gets the project from a Git repository, builds it, and produces artifacts. Another one has to copy the artifacts from the first job and publish them to Artifactory (I tried to do this using the Artifactory plugin). But the thing is that the Artifactory plugin is only available in the build job; there is nothing like "Generic-Artifactory Integration" in the second job's configuration.
Does anyone know what the requirements are for making the plugin work in the publish job?
You can write a small shell script leveraging the Artifactory REST API and execute it in your second, non-build job.
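If the second job can be a Pipeline job, a minimal sketch; the first job's name, the file name, the server, and the 'artifactory-cred' credential are assumptions, and the Copy Artifact plugin is assumed for fetching the file:

node {
    // Copy the artifact produced by the first job (Copy Artifact plugin)
    copyArtifacts projectName: 'build-job', filter: 'output.tar.gz'
    withCredentials([usernamePassword(credentialsId: 'artifactory-cred',
                                      usernameVariable: 'AUSER',
                                      passwordVariable: 'APASS')]) {
        // Publish it with a plain authenticated PUT against the REST API
        sh 'curl -u "$AUSER:$APASS" -s -X PUT --data-binary @output.tar.gz ' +
           '"http://artifactory.example.com/artifactory/libs-release-local/output.tar.gz"'
    }
}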
I have done a similar thing with Maven and a zip file: I deployed the zip with a Maven build step calling deploy:deploy-file, setting my Artifactory repository in settings.xml and deploying directly to my Artifactory repository.
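For reference, a minimal sketch of such a step driven from a pipeline; the coordinates, the repositoryId (matching a <server> entry in settings.xml), and the URL are all assumptions:

node {
    // deploy:deploy-file pushes a single file straight to the repository
    sh '''
        mvn deploy:deploy-file \
            -Dfile=output.zip \
            -DgroupId=com.example \
            -DartifactId=my-app \
            -Dversion=1.0.0 \
            -Dpackaging=zip \
            -DrepositoryId=artifactory \
            -Durl=http://artifactory.example.com/artifactory/libs-release-local
    '''
}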