How to build a Jenkins pipeline that chooses among 4 path options to copy .war files to a selected Tomcat's webapps folder?

I am new to Jenkins and I have this situation: devs used to deploy .war files themselves to various paths, using a kind of manual versioning.
I set up my server with an NFS share holding those file repositories, and I would like to build a pipeline that chooses among different paths to deploy files to my Tomcat server.
When a dev produces a new version, I just add the new path. But I need to select from pre-inserted paths as a choice; that is, I need to deploy the new files by selecting a path in a parameterized pipeline.
That way I can deploy just by choosing one of the paths, always targeting the same Tomcat server.
It is hard to get started. I configured my Jenkins Docker container and it can execute remote test jobs fine.
I need to write this kind of pipeline, and I am having trouble putting all the pieces together.
Alternatively, if possible, I could just pass the absolute NFS path to the job as a parameter when I start the pipeline, to make it very flexible.
How can I start? I installed some plugins like File Operations and one for sending files and executing commands over SSH, but it is still hard to put together.
Any help, pointers, or links to study are appreciated.
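As a starting point, the flow described above can be sketched as a declarative pipeline with a choice parameter holding the pre-inserted NFS paths. This is only a sketch: the paths, choice values, and Tomcat location below are hypothetical placeholders to be replaced with your real ones.

```groovy
// A minimal sketch, assuming the NFS share is mounted on the agent at
// /mnt/nfs and Tomcat lives at /opt/tomcat -- both are placeholders.
pipeline {
    agent any
    parameters {
        choice(
            name: 'RELEASE_PATH',
            choices: ['/mnt/nfs/app/v1.0',
                      '/mnt/nfs/app/v1.1',
                      '/mnt/nfs/app/v2.0',
                      '/mnt/nfs/app/v2.1'],
            description: 'Pre-inserted NFS path holding the .war files to deploy'
        )
    }
    stages {
        stage('Deploy to Tomcat') {
            steps {
                // Copy every .war from the chosen NFS path into Tomcat's
                // webapps directory; Tomcat hot-deploys whatever lands there.
                sh "cp ${params.RELEASE_PATH}/*.war /opt/tomcat/webapps/"
            }
        }
    }
}
```

For the "very flexible" variant, swap the choice parameter for a string parameter and pass any NFS path when triggering the build; when a dev produces a new version, you then only add one entry to the choices list (or nothing at all in the string-parameter variant).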

Related

How to attach build log files to Jenkins?

I'm building a Jenkins pipeline, and when the pipeline fails during server installation, some logs are generated on the machine where the server is being installed.
I want to attach those logs to the Jenkins build so that people can view the files from the build itself instead of logging into the machine to find them.
I saw the Copy To Slave plugin, but when I searched for it in Jenkins to install it, it's not listed.
Could you please suggest which plugin would help me attach log files to a Jenkins build?
Due to the complex nature of filesystems, Jenkins is not able to copy logs from locations outside of the Jenkins root directory. This is for security reasons, which is why the Copy To Slave plugin you referred to has been discontinued.
In short, Jenkins spawns processes that spawn other processes that are owned by different users in the filesystem (e.g. root). For this reason, it is highly probable that the log files you are referring to are located elsewhere on the file system (i.e. not in $JENKINS_HOME), and thus are not owned by the jenkins user.
It is possible to use cat or tail on the log files in the Jenkins build itself. In combination with a plugin like Log Parser, this can provide some nice output in another screen.
I would be interested in what you mean by “install”. Can the install happen during the building of a Docker image, or in a pre-built Docker container? If so, you can copy the “installed” files to the destination.
This would help you, because any log files created during the “install” can be copied out from the docker container and attached to the Jenkins build as an archived artifact.
For this, you don’t even need a plug-in.
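As a sketch of that approach: the container name `app-install` and the log path inside it are hypothetical; the pipeline copies the log out of the container into the workspace and archives it, so it appears on the build page.

```groovy
// A sketch, assuming the install ran in a container named "app-install"
// and wrote its log to /var/log/install.log -- both are placeholders.
pipeline {
    agent any
    stages {
        stage('Collect install logs') {
            steps {
                // Copy the log out of the container into the workspace...
                sh 'docker cp app-install:/var/log/install.log install.log'
                // ...and attach it to the build as an archived artifact.
                archiveArtifacts artifacts: 'install.log', allowEmptyArchive: true
            }
        }
    }
}
```

Combined with a `sh 'tail -n 100 install.log'` step, the end of the log also shows up directly in the console output, which a log-parsing plugin can then pick apart.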

Export file generated by a project

I have a Java project that is running as a job on a Jenkins server. The project is generating a file that is being stored locally on the server in the respective project folder. Currently, to get hold of this file, I am logging into the Jenkins server and get it manually. I would like to make this file available for download directly through the Jenkins job somehow. Not sure if there is a way to do that though. Is there a plug in that might add this functionality or is there any other way to achieve that?
Archive the file. For a freestyle job, that would be a "Post Build Step". For a pipeline job, use archiveArtifacts.
You definitely don't want to rely on the file being in a workspace on the Jenkins agent machine somewhere. Workspace directories can move, change, or be removed.
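For a pipeline job, a minimal sketch looks like this (the build command and the report path are hypothetical placeholders for whatever your project actually produces):

```groovy
// A sketch: "./gradlew build" and "output/report.xml" are assumptions --
// substitute your real build step and the path of the generated file.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh './gradlew build'
            }
        }
    }
    post {
        success {
            // The archived file becomes downloadable from the build page,
            // independent of what later happens to the workspace.
            archiveArtifacts artifacts: 'output/report.xml', fingerprint: true
        }
    }
}
```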

Jenkins ArtifactDeployer simply creates a new dir in the base dir?

I am in Jenkins and using ArtifactDeployer. The console output tells me that the remote repo is http://myrepo, but all the plugin does is create a new folder in the base directory I specify in its configuration. It correctly finds the one file to copy, but strangely it just creates a new directory and copies the file in there. I thought this would enable me to deploy artifacts to another server... Can I do that?
No, you cannot do that with ArtifactDeployer, but there are other plugins you can use; read on:
Jenkins provides by default a feature for archiving artifacts generated by the build of a Jenkins job. These artifacts are archived in the JENKINS_HOME directory. However, this directory also contains tool configurations (global and job configurations), so there is no separation between infrastructure data, job data, and generated elements. This is often considered bad practice and makes things harder to manage from an administrator's point of view.
Unfortunately, it's not possible to extend the 'archived artifacts' feature to archive artifacts in a location other than JENKINS_HOME.
The main goal of the ArtifactDeployer plugin is to archive artifacts in your desired locations (other than the JENKINS_HOME directory).
There are several Jenkins plugins close to ArtifactDeployer, such as the CopyArtifact plugin or the CopyArchiver plugin, which publish artifacts from Jenkins resources (the current workspace, old builds of the same job or of other jobs, ...) to remote locations over the file:// protocol.
There are also other plugins that handle other protocols, such as ftp:// and ssh://.
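When a plugin feels heavier than needed, a plain `scp` in a pipeline step is often enough to push an artifact to another server. A sketch, with the user, host, and both paths as placeholders, and assuming SSH keys are already set up on the agent:

```groovy
// "deploy@repo.example.com", "build/libs/app.war", and "/srv/artifacts/"
// are all hypothetical -- replace them with your real user, host, and paths.
pipeline {
    agent any
    stages {
        stage('Publish') {
            steps {
                // Requires passwordless SSH (key-based auth) from the agent
                // to the target server.
                sh 'scp build/libs/app.war deploy@repo.example.com:/srv/artifacts/'
            }
        }
    }
}
```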

Plugin to Copy Files from Desktop to Jobs Workspace in Jenkins

I have a scenario wherein users who have created some configuration files need to upload them from the desktop where they access Jenkins into the job's workspace, in order to build and execute tests.
So I tried the Config File Provider plugin as described at https://wiki.jenkins-ci.org/display/JENKINS/Config+File+Provider+Plugin. This works fine for copying a configuration file into the Jenkins UI, from where it can later be synced to the slave at the path given in the build environment of the respective job. But the users who wish to upload these files don't have administrative rights, so they cannot access Configuration File Management, which sits under the Manage Jenkins tab. Is there any way to move Configuration File Management into the Jenkins sidebar and allow users to edit these files?
Are there any other plugins that would help me achieve this? I also tried the Copy To Slave plugin, but it only copies files under $JENKINS_HOME/userContent to the job's workspace; we would have to copy the files from the desktop to $JENKINS_HOME/userContent first and then use the plugin.
What about using a parameterized job and making one of the parameters a file parameter? See https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Build
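For a pipeline job this takes an extra step, since classic file parameters are known not to work well there. A sketch under the assumption that the newer File Parameters plugin is installed, which adds a `base64File` parameter type and a `withFileParameter` step (the parameter name `CONFIG` and the destination path are hypothetical):

```groovy
// A sketch assuming the File Parameters plugin; "CONFIG" and
// "tests/config.xml" are placeholders for your own names.
pipeline {
    agent any
    parameters {
        base64File 'CONFIG'
    }
    stages {
        stage('Use uploaded file') {
            steps {
                // withFileParameter materializes the upload as a temp file
                // and exposes its path in the $CONFIG environment variable.
                withFileParameter('CONFIG') {
                    sh 'cp "$CONFIG" tests/config.xml'
                }
            }
        }
    }
}
```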

Jenkins - Is it necessary to have a repository for running a multi configuration project

I am new to Jenkins. I am trying to configure Jenkins to run my Selenium tests on multiple browsers. So I thought multi-configuration project would be a best choice. I am using Sauce labs for performing cross-browser testing.
My selenium source code is on my local system directory. I have not uploaded the same to any of the repositories. I have configured a multi-configuration project with a custom workspace pointing to my local source code, and selected "none" in Source code management section.
Now, when I build the job, it creates a workspace for each browser combination, e.g. <project workspace>\SELENIUM_DRIVER\Windows 2003firefox16 and <project workspace>\SELENIUM_DRIVER\Windows 2003internet explorer8. But the files are not copied to each of these workspaces automatically; I need to copy my files into those directories manually for it to work.
Is it necessary to have a repository like SVN, CVS, or Git for this to work? Or is there a way I can run the build from my local system?
For this to work, a repository is not required, but you do need a good way to access your artifacts and Selenium code. I suggest you drop the artifacts on a shared drive as a preliminary step, and put your Selenium source code on a shared drive as well, as a practice; this will allow you to run multiple tests from multiple machines.
Cheers
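In pipeline terms, the suggestion above amounts to a first step that copies the test code from the shared drive into whichever workspace the combination runs in (the share path below is a placeholder; a matrix freestyle job would use the same command as a shell build step):

```groovy
// A sketch: "/mnt/shared/selenium-tests" stands in for your shared drive.
// Each browser-combination workspace gets its own fresh copy of the code.
pipeline {
    agent any
    stages {
        stage('Fetch tests') {
            steps {
                sh 'cp -r /mnt/shared/selenium-tests/. "$WORKSPACE/"'
            }
        }
    }
}
```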
