I want to publish artifacts to multiple SCP repositories from Jenkins. Currently there is only one "publish artifacts to SCP repository" option under the post-build actions section in Jenkins, and the plugin only offers a one-to-one option for publishing artifacts to a single remote location.
Use the "Execute Shell" option and use the scp command to copy to multiple servers. That is the only option; otherwise, configure one more job and call that job after this one succeeds.
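A minimal sketch of such an "Execute Shell" step, assuming placeholder hostnames, user and paths (the `echo` prefix makes this a dry run; remove it to perform the real copies):

```shell
# Copy one build artifact to several servers.
# "deploy", the hostnames and the destination path are placeholders.
ARTIFACT="dist/app.tar.gz"
SERVERS="web1.example.com web2.example.com web3.example.com"

for host in $SERVERS; do
    # Remove 'echo' to actually copy; add '|| exit 1' so the build
    # step fails when a copy fails.
    echo scp "$ARTIFACT" "deploy@${host}:/var/www/releases/"
done
```

For the real thing you would typically set up SSH key authentication for the Jenkins user so the scp calls run non-interactively.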
There's a post-build action in Jenkins called "Send artifacts over SSH". It's primarily meant for publishing build artifacts, but you can use it for your purpose.
Look at the bottom of the screenshot: there's a button for adding another server to publish artifacts to, which serves the purpose!
P.S.: You should do a little hands-on work / searching online for such solutions before posting them to Stack Overflow.
We are searching for a CI and CD solution for our web app based on NodeJS/Meteor.
Our Process should be:
On each push to master / pull request / merge to master, do the following:
Checkout
Run Code Style Checks (coffeelint, scsslinter, etc.)
Build Code
Run Tests
Generate Tarball-Archive
Deploy the archive to the Development (Quality Management) server, extract it and run it
The next step would be manual testing of the app on our dev server.
When we think it is deployable, I want to have a button in Jenkins like "Deploy these artifacts NOW to the live instance". How can I achieve this? Also nice would be something like "deploy these artifacts at 2 a.m. to the live instances".
Everything from checkout to deployment to the dev server is already implemented in Jenkins. What we need now is a button "deploy this artifact to live".
You need another job to get this working. The other job takes the artifact from the build job and deploys it wherever you want.
There is no way to include such behavior in the same job.
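As a sketch, the second job's "Execute Shell" step could pull the archived artifact over Jenkins' artifact URL and push it to the live host. The job name, hostnames and artifact name below are placeholders, and the `echo` prefixes make it a dry run:

```shell
# Hypothetical deploy job: fetch the build job's archived artifact,
# then copy it to the live server. Remove the 'echo's to run for real.
JENKINS_URL="http://jenkins.example.com:8080"
BUILD_JOB="webapp-build"
ARTIFACT="webapp.tar.gz"
ARTIFACT_URL="${JENKINS_URL}/job/${BUILD_JOB}/lastSuccessfulBuild/artifact/${ARTIFACT}"

echo curl -fsSL -o "$ARTIFACT" "$ARTIFACT_URL"
echo scp "$ARTIFACT" "deploy@live.example.com:/opt/webapp/"
```

Triggering this job by hand gives you the "deploy NOW" button; for the 2 a.m. case you could instead give the job a "Build periodically" schedule such as `0 2 * * *`.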
I have a project to upgrade my Jenkins. The problem is that I use a Jenkins plugin called "Publish Over SSH". As I read, that plugin can publish our artifact files to another server over an SSH connection. But in this configuration it always runs after the job finishes, so every job I run always publishes to the server.
So I want to ask: how can I pick which artifacts from the build history I want to publish, and publish them only when I want to? I also had the idea of adding a button to the Jenkins web interface that triggers adding a job configuration, builds it, and publishes it over SSH. But I don't know how to modify the Jenkins web interface. My server uses Tomcat, which runs the Java ".war" file.
So, do you have any suggestions for my problem: how to modify the Jenkins web interface, or how to pick certain build artifacts?
Thanks...
I would like to share the byproducts of one Jenkins job with another job that runs after it.
I am aware that I can set "use custom workspace", but that would merge the jobs together, which is not what I want. I just need to place a few files in a location that is read by the next job.
So far I can't find out how to tell Jenkins jobs to look in a specific folder, since Jenkins has no concept of the file system beyond what goes on in the job's workspace folder.
Is there a way to access the host file system, or to declare a shared folder inside Jenkins (like the main workspace folder, which contains all the other jobs' workspaces), so I can copy and read files in it from different jobs?
Where possible I would like to avoid plugins and extras; I would like to use what is included with Jenkins base.
I realize you want to avoid plugins, but the Jenkins-y way to accomplish this is to use the Copy Artifacts plugin, which does exactly what you want.
There are a variety of problems that you may run into when trying to manage the filesystem yourself. (How do you publish to a common location when running on different build nodes? How do you handle unsuccessful builds?) This solution uses Jenkins to track builds and artifacts. In the absence of a separate artifact repository, it's a lot better than trying to manage it yourself.
To use Copy Artifacts:
As a Post-Build step, choose "Archive Artifacts" in the first job and enter the path(s) to the generated files.
Then in the second job, add a "Copy Artifacts from another project" build step to grab some or all files marked as artifacts in your first job. (By default, Jenkins will re-create the paths of the generated files in the second job's workspace, which may or may not be what you want, but you can change this behavior.)
Configure Jenkins to run a Maven build and deploy your artifacts with "mvn clean deploy". This will push them to an "artifact server", which you probably already have; if not, you need to add/configure one.
Then configure your downstream job, also a Maven job, to depend on the same artifact that was published by the upstream job. This will trigger a download of the artifact from the artifact server and make it available to the build.
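For illustration, the upstream job's Maven invocation could look like the sketch below. The repository id and URL are placeholders; normally the target repository lives in the pom's `<distributionManagement>` section, but the maven-deploy-plugin (2.x) also accepts it on the command line via `altDeploymentRepository`. The `echo` makes this a dry run:

```shell
# Upstream job: build and publish the artifact to the artifact server.
# 'artifact-server' and the URL are placeholders.
REPO_URL="https://repo.example.com/libs-release-local"
echo mvn clean deploy "-DaltDeploymentRepository=artifact-server::default::${REPO_URL}"
```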
Is it possible to iterate over slaves with a particular label in a Jenkins job?
For example, let's say I have a few slaves with the label "redhat". I have a job in which I want to logically do something like:
for slave in slave_list_with_label_redhat do
ssh someuser@${slave.hostname}
done
Thanks in advance!!
Edit: use case in detail:
This is to work around a bug in Jenkins where archiving artifacts fails from an AIX slave: https://issues.jenkins-ci.org/browse/JENKINS-13614
So what we really want to do is: once the build is complete on the build slave, scp the build files to the available Jenkins AIX slaves, then install them and run a few tests on the test slaves.
You might find https://wiki.jenkins-ci.org/display/JENKINS/Elastic+Axis to fit your needs
I think this is used as part of a multi-configuration (matrix) job: create such a job and you can then select Elastic Axis as one of the axes.
I usually let my jobs download all necessary artifacts. So your option would be to add another build step to your build job and archive the artifacts manually outside of Jenkins. You can use SCM tools (not perfect for binary artifacts, but they offer great auditing features), a binary artifact repository (e.g. Nexus, the best option, though I have no idea about its auditing features), or just a remote filesystem (which you can mount locally on your slave machines). The build job just needs to pass some information to the test jobs so that they can fetch the right artifact. This way, Jenkins still decides where the test job(s) run, but the slaves always have access to the artifacts.
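One way to sketch that hand-off (all paths and names below are placeholders): have the build job write the artifact's location into a small properties file, which downstream jobs can consume, for example via the Parameterized Trigger plugin's "Parameters from properties file" option:

```shell
# Build job: record where this build's artifact will live so the test
# jobs can fetch the right one. SHARE_DIR and the name are placeholders.
SHARE_DIR="${SHARE_DIR:-/mnt/build-artifacts}"
VERSION="1.0.${BUILD_NUMBER:-0}"    # BUILD_NUMBER is set by Jenkins
ARTIFACT_PATH="${SHARE_DIR}/myapp-${VERSION}.tar.gz"

# Downstream jobs read this file to learn which artifact to test.
echo "ARTIFACT_PATH=${ARTIFACT_PATH}" > artifact.properties
cat artifact.properties
```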
So I've got a few separate jobs in Jenkins. The first one gets the project from a Git repository, builds it and produces artifacts. Another one has to copy the artifacts from the first job and publish them to Artifactory (I tried to do this using the Artifactory plugin). But the thing is that the Artifactory plugin is only available in the build job; there's nothing like "Generic-Artifactory Integration" in the second job's configuration.
Does anyone know what the requirements are for making the plugin work in the publish job?
You can write a small shell script leveraging Artifactory REST API and execute it in your second, non-build job.
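A sketch of what such a script might look like: uploading to Artifactory is a plain HTTP PUT to the target repository path. Host, repository, path and credential variables are placeholders, and the `echo` makes it a dry run:

```shell
# Upload one artifact through Artifactory's REST API.
# Remove 'echo' to perform the real upload.
ARTIFACTORY="https://artifactory.example.com/artifactory"
REPO="libs-release-local"
TARGET="com/example/myapp/1.0/myapp-1.0.jar"

echo curl -u "${ART_USER}:${ART_PASSWORD}" -T "myapp-1.0.jar" \
    "${ARTIFACTORY}/${REPO}/${TARGET}"
```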
I have done a similar thing with Maven and a zip file. I deployed a zip via a Maven build step calling deploy:deploy-file, with my Artifactory repository's credentials set in settings.xml, deploying directly to my Artifactory repository.
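A hedged sketch of such a deploy:deploy-file call (coordinates, file name and repository URL are placeholders; the credentials for the repository id are the ones set in settings.xml; the `echo` makes it a dry run):

```shell
# Push a plain zip to an Artifactory repository without a full build.
REPO_ID="artifactory"    # must match a <server> id in settings.xml
REPO_URL="https://artifactory.example.com/artifactory/libs-release-local"

echo mvn deploy:deploy-file \
    -DgroupId=com.example -DartifactId=myapp -Dversion=1.0 \
    -Dpackaging=zip -Dfile=myapp.zip \
    "-DrepositoryId=${REPO_ID}" "-Durl=${REPO_URL}"
```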