How to add a script file to a Jenkins job workspace remotely and run it as part of a build step - jenkins

Is it possible to add a file to a Jenkins job workspace and run it from a build step?
Jenkins is running remotely and I cannot directly access the workspace as a folder structure.
Is there any way to achieve this from the Jenkins dashboard?

Yes, you can do that. Place your file in a Git repository, pull it in the Jenkins job, and then execute it as part of the build.
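For example, if the script lives in the repository that the job checks out, a minimal "Execute shell" build step could look like this (the script path is an illustrative assumption):

# Execute shell build step; scripts/myscript.sh is an illustrative path,
# relative to the workspace where the Git repository was checked out
chmod +x scripts/myscript.sh
./scripts/myscript.sh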

Related

Change working directory during Jenkins build (not gradle, not maven)

Can I change the working directory during a Jenkins job for all successive steps?
My job's first step checks out a git project. Unfortunately this project has a mix of technologies; it's not a java/maven project (so the trick of 'mvn -f subdir/pom.xml' doesn't apply), nor is it a gradle project. So I'd like to change to a subdirectory of the checked-out project and start running Jenkins plugins there: invoking shell scripts, running tox to test Python code, running docker to build images, etc.
Maybe Jenkins wants every step to begin in $WORKSPACE, and allowing a directory change during the job would break some vital assumptions?
I know this has been asked before. Similar questions but answers specific to maven:
Jenkins Maven Build -> Change Directory and
Jenkins: How To Build multiple top-level projects from a git repository?
Similar question but answer specific to gradle:
Change directory during a build job on jenkins
You can separate out jobs based on subfolders and use a checkout filter in your SCM configuration so that only the subfolder you want for that job gets cloned into your workspace. As the first step of your build, use a batch/shell command to move all the files from the subfolder to the workspace, and then run all the steps that you want.
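A minimal sketch of that first build step, assuming the cloned subfolder is named "subdir" (an illustrative name):

# Execute shell build step (use a batch equivalent on Windows)
# Move everything from the cloned subfolder up to the workspace root
mv subdir/* "$WORKSPACE"/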

Send war to another job in Jenkins

I've built two jobs in my Jenkins instance:
Gradle job builds war task and generates a war file ready to be deployed.
Docker job builds a Docker image from a repository.
Both are working fine. However, the second one depends on the first one. So, Docker job needs to use the last war file generated by Gradle job.
How can I do that?
You can use the Parameterized Trigger plugin to trigger sub-jobs with parameters. Do the following to resolve the above problem:
Create a parent job which will trigger the two sub-jobs you mentioned.
Trigger the first job and archive its artifacts (the war file).
Then pass the build number of the first job's last triggered build to the second job and start it. In the second job, use the Copy Artifact plugin to copy the war from the first job using the specific build number that was passed in.
This will resolve your problem!
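As a plain-shell alternative to the Copy Artifact plugin, the archived war can often be fetched over HTTP from the upstream build. This is only a sketch: the job name, artifact path, and BUILD_NUM parameter are assumptions, and your instance may require authentication:

# Execute shell step in the Docker job; BUILD_NUM is the build-number
# parameter passed from the parent job, and all names/paths are illustrative
curl -fsSL -o app.war \
  "${JENKINS_URL}job/gradle-war-job/$BUILD_NUM/artifact/build/libs/app.war"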
Use a post-build task for your first job (the Gradle job) to just cp the war file into the workspace of the Docker job. Then configure the Docker job so that it does NOT clean the workspace before the build. And for the post-build action, choose to delete the workspace after the build; this will ensure you only have the latest war file in the Gradle job's workspace. Also, you should use a post-build trigger, if you're not using that already.
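A rough sketch of that cp post-build step, assuming both jobs build on the same node and use the default workspace locations (all paths are illustrative):

# Post-build shell task in the Gradle job (paths are illustrative;
# assumes both jobs build on the same node)
cp build/libs/*.war /var/lib/jenkins/workspace/docker-job/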
Good Luck!

How to copy the artifacts from slave to Jenkins workspace?

I'm running a Jenkins job on a slave and I want to store the generated artifacts on the server. Since the job is currently running on the slave, the artifacts are also created there.
I tried using Post-build Actions -> Archive the artifacts, but it throws the following build error:
ERROR: No artifacts found that match the file pattern "**/*.gz". Configuration error?
ERROR: '**/*.gz' doesn't match anything: '**' exists but not '**/*.gz'
Any help in this regard is highly appreciated.
Sounds like the Copy To Slave Plugin is what you need.
It can copy to the slave (before the build) and from the slave (after the build).
Copy files back to master node:
To activate this plugin for a given job, simply check the Copy files back to the job's workspace on the master node checkbox in the Post-build Actions section of the job. You then get the same two fields as for the Copy files to slave node before building section.
If you want to copy artifacts from JobA to the workspace of some other job, you can do it using the Copy Artifact Plugin, which is very simple to understand.
In case you just want to archive the artifacts already produced in JobA, then you are already going in the right direction and need to check what you are missing... are you sure that the artifacts are in the current workspace?
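One quick way to check is a throwaway "Execute shell" step placed just before the archive step; it simply lists whatever the archive pattern would match (diagnostic only):

# List any .gz files under the workspace; if this prints nothing,
# the **/*.gz archive pattern has nothing to match
find "$WORKSPACE" -name "*.gz"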
Doron

Execute Shell script from workspace on remote machine after build successful (Jenkins)

The scenario is: I have a job A which runs my ant script and packages the artifacts for me.
I am also using the Parameterized Trigger plugin to trigger my "Job B", which will deploy my artifact on a remote machine.
Job A is working fine, and so is Job B.
The tasks that I have to perform with Job B are:
Git checkout (which contains my deployment scripts) (doing this successfully).
Copying artifacts from the previous build to the remote machine (doing this successfully).
Running a shell script on the remote machine (script present in the workspace folder) - facing issues.
I browsed various plugins for this, but none of them allow me to run a shell script after the "SCP to remote machine" step, which is present in the post-build actions.
I would like to execute this same sequence; however, if you guys have any other suggestions, please share.
Thanks in advance!
With the Publish Over SSH Plugin, you can execute a script after the files have been copied over.
Under Post-build Actions
Add Send build artifacts over SSH
Select a preconfigured server (done in global configuration)
Select files to copy from workspace
Enter Exec command
If one of the files you copy is your shell script, you can enter it here as an "exec command".
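For example, the Exec command field could contain something like this (the remote directory and script name are assumptions):

# Contents of the "Exec command" field; /home/deploy/deploy.sh is the
# copied script's location under the configured remote directory (illustrative)
chmod +x /home/deploy/deploy.sh
/home/deploy/deploy.sh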
To solve my query I used the Jenkins SSH Plugin. This provides a configuration tab where I can add multiple hosts and then use them in my job-level configuration.
Link to Plugin
You get the ability to execute a shell script on the remote host as a pre-build or post-build step.
I updated the path in Publish Over SSH and it worked for me.

How to copy build XMLs from Master to the Slave node in Jenkins?

I have created a Jenkins job which used to run on the master, and I have the build.xml file on the master.
Now I have added a slave node and added the setting Restrict where this project can be run so that my job always runs on a particular slave.
Now my build jobs are failing and I can see:
[EnvInject] - Loading node environment variables.
Building remotely on demo_slave_inst2 (slave1) in workspace /root/slave/workspace/demo_job
FATAL: Unable to find build script at /root/slave/workspace/demo_job/autobvt.xml
Build step 'Invoke Ant' marked build as failure
Recording test results
Finished: FAILURE
This autobvt.xml file already exists on the master, so it looks like I need to copy the file over to the slave node manually, which is not very handy.
How can I instruct Jenkins to copy it as part of the build?
Use "Copy data to Workspace" http://wiki.jenkins-ci.org/display/JENKINS/Copy+Data+To+Workspace+Plugin using which you can copy the files from master to slave and run them as a part of build process (No manual effort needed!)
I sorted the issue using the Copy to Slave plugin.
