Jenkins is skipping over ProcessBuilder

I have a Cucumber test job that validates the contents of a downloaded file.
I am using a ProcessBuilder to call a bat file that triggers a jar file. When Jenkins gets to the try/catch block that holds my ProcessBuilder, it just skips over it.
My Call:
ProcessBuilder pb = new ProcessBuilder("src/Files/test.bat", appName, key, dir).inheritIO();
System.out.println("Report Saved");
pb.start().waitFor();
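One likely cause is that the surrounding catch block swallows the real failure (for example, the relative bat path not resolving under the Jenkins workspace), which makes the call look as if it were skipped. A minimal diagnostic sketch in Groovy, reusing appName, key and dir from the question:

// Log the resolved path, the exit code and any exception instead of
// swallowing them; an empty catch block makes this step look "skipped".
def bat = new File("src/Files/test.bat").absoluteFile
println "Running: $bat"          // verify the path Jenkins actually resolves
try {
    def proc = new ProcessBuilder(bat.path, appName, key, dir)
            .inheritIO()
            .start()
    int exit = proc.waitFor()    // block until the bat file finishes
    println "Report saved, exit code: $exit"
} catch (Exception e) {
    e.printStackTrace()
}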

Related

Jenkins pipeline script to copy or move file to another destination

I am preparing a Jenkins pipeline script in Groovy. I would like to move all files and folders to another location. Since Groovy supports Java, I used the Java code below to perform the operation.
pipeline {
    agent any
    stages {
        stage('Organise Files') {
            steps {
                script {
                    File sourceFolder = new File("C:\\My-Source");
                    File destinationFolder = new File("C:\\My-Destination");
                    File[] listOfFiles = sourceFolder.listFiles();
                    echo "Files Total: " + listOfFiles.length;
                    for (File file : listOfFiles) {
                        if (file.isFile()) {
                            echo file.getName()
                            Files.copy(Paths.get(file.path), Paths.get("C:\\My-Destination"));
                        }
                    }
                }
            }
        }
    }
}
This code throws the exception below:
groovy.lang.MissingPropertyException: No such property: Files for class: WorkflowScript
I tried the code below too, but it's not working either.
FileUtils.copyFile(file.path, "C:\\My-Destination");
Finally, I tried Java I/O streams to perform the operation; that code is below:
def srcStream = new File("C:\\My-Source\\**\\*").newDataInputStream()
def dstStream = new File("C:\\My-Destination").newDataOutputStream()
dstStream << srcStream
srcStream.close()
dstStream.close()
But it's not working either and throws the exception below:
java.io.FileNotFoundException: C:\My-Source (Access is denied)
Can anyone suggest how to solve this problem? Please also let me know how I can delete the files from the source location after the copy or move. One more thing: during the copy, can I filter out some folders and files using wildcards? Please let me know that as well.
Don't execute these I/O functions with plain Java/Groovy. Even if you got this running, it would always execute on the master and not on the build agents. Use pipeline steps for this as well, for example:
bat("xcopy C:\\My-Source C:\\My-Destination /O /X /E /H /K")
or using the File Operations Plugin
fileOperations([fileCopyOperation(
    excludes: '',
    flattenFiles: false,
    includes: 'C:\\My-Source\\**',
    targetLocation: "C:\\My-Destination"
)])
I may not have hit exactly the right syntax for Windows paths in these examples, but I hope you get the point.
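To cover the follow-up questions (deleting the source after the copy, and filtering with wildcards), a sketch along these lines should work with the File Operations Plugin; the Ant-style patterns and paths are illustrative, not taken from the question:

// Copy only XML files, skip a temp folder, then delete the copied sources.
// fileCopyOperation and fileDeleteOperation come from the File Operations
// Plugin; paths here are relative to the workspace.
fileOperations([
    fileCopyOperation(
        includes: 'My-Source/**/*.xml',   // wildcard filter: only XML files
        excludes: 'My-Source/tmp/**',     // filtered out during the copy
        flattenFiles: false,
        targetLocation: 'My-Destination'
    ),
    fileDeleteOperation(includes: 'My-Source/**/*.xml')
])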

How to run MsBuild in a Jenkins scripted pipeline correctly?

I am using a scripted pipeline in Jenkins and I want to compile a solution using MsBuild.
The problem is that when I run it with the bat command: bat 'MsBuild.exe solution.sln /p:Configuration=Debug' (which runs it as a batch file) and the build FAILS, the job doesn't fail.
It's as if it doesn't recognize that MsBuild failed to compile the solution, and it continues to the next steps.
How can I run MsBuild and analyze the output, so that if the build fails the job fails too?
Thank you
Try the following and see how it goes:
def msbuild = "path/to/msbuild/MsBuild.exe"
def exitStatus = bat(returnStatus: true, script: "${msbuild} solution.sln /p:Configuration=Debug")
if (exitStatus != 0) {
    currentBuild.result = 'FAILURE'
}
And if you don't want the pipeline to continue any further, you can throw an error when the exit status is not 0:
if (exitStatus != 0) {
    currentBuild.result = 'FAILURE'
    error 'build failed'
}
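Note that a plain bat step (without returnStatus: true) normally fails the build by itself when the script exits with a nonzero code, so if the failure really isn't propagating, it is worth checking that MsBuild is the last command the batch script runs:

// Without returnStatus, a nonzero exit code from the script fails the step.
bat "${msbuild} solution.sln /p:Configuration=Debug"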

Copy artifacts of multiple builds on same job in Jenkins

I'm using the MultiJob plugin and have a job (Job-A) that triggers Job-B several times.
My requirement is to copy some artifacts (XML files) from each build.
The difficulty is that using the Copy Artifact Plugin with the "last successful build" option will only take the last build of Job-B, while I need to copy from all the builds that were triggered by the same build of Job-A.
The flow looks like this:
Job-A starts and triggers:
    Job-B build #1
    Job-B build #2
    Job-B build #3
    ** copy artifacts of all last 3 builds, not just #3 **
Note: Job-B could be executed on different slaves in the same run (I set the slave to run on dynamically via a parameter on the upstream Job-A).
When all builds are completed, I want Job-A to copy artifacts from builds #1, #2 and #3, not just from the last one.
How can I do this?
Here is a more generic Groovy script; it uses the Groovy Plugin and the Copy Artifact Plugin; see the instructions in the code comments.
It simply copies the artifacts from all downstream jobs into the upstream job's workspace.
If you call the same job several times, you could use the build number in copyArtifact's 'target' parameter to keep the artifacts separate (a sketch of this follows the script).
// This script copies artifacts from downstream jobs into the upstream job's workspace.
//
// To use, add an "Execute system groovy script" build step into the upstream job
// after the invocation of other projects/jobs, and specify
// "/var/lib/jenkins/groovy/copyArtifactsFromDownstream.groovy" as the script.
import hudson.plugins.copyartifact.*
import hudson.model.AbstractBuild
import hudson.Launcher
import hudson.model.BuildListener
import hudson.FilePath

for (subBuild in build.builders) {
    println(subBuild.jobName + " => " + subBuild.buildNumber)
    copyTriggeredResults(subBuild.jobName, Integer.toString(subBuild.buildNumber))
}

// Inspired by http://kevinormbrek.blogspot.com/2013/11/using-copy-artifact-plugin-in-system.html
def copyTriggeredResults(projName, buildNumber) {
    def selector = new SpecificBuildSelector(buildNumber)
    // CopyArtifact(String projectName, String parameters, BuildSelector selector,
    //              String filter, String target, boolean flatten, boolean optional)
    def copyArtifact = new CopyArtifact(projName, "", selector, "**", null, false, true)
    // use reflection because a direct call invokes a deprecated method
    // perform(Build<?, ?> build, Launcher launcher, BuildListener listener)
    def perform = copyArtifact.class.getMethod("perform", AbstractBuild, Launcher, BuildListener)
    perform.invoke(copyArtifact, build, launcher, listener)
}
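For the case mentioned above where the same job is triggered several times, the 'target' parameter of the constructor can keep the artifacts separate; the directory layout below is just an example:

// Copy each triggered build's artifacts into its own subdirectory of the
// upstream workspace, e.g. artifacts/Job-B/42.
def copyArtifact = new CopyArtifact(projName, "", selector, "**",
        "artifacts/${projName}/${buildNumber}", false, true)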
I suggest the following approach:
Use an Execute System Groovy script build step from the Groovy Plugin to execute the following script:
import hudson.model.*

// get upstream job
def jobName = build.getEnvironment(listener).get('JOB_NAME')
def job = Hudson.instance.getJob(jobName)
def upstreamJob = job.upstreamProjects.iterator().next()

// prepare build numbers
def n1 = upstreamJob.lastBuild.number
def n2 = n1 - 1
def n3 = n1 - 2

// set parameters
def pa = new ParametersAction([
    new StringParameterValue("UP_BUILD_NUMBER1", n1.toString()),
    new StringParameterValue("UP_BUILD_NUMBER2", n2.toString()),
    new StringParameterValue("UP_BUILD_NUMBER3", n3.toString())
])
Thread.currentThread().executable.addAction(pa)
This script will create three environment variables which correspond to the last three build numbers of the upstream job.
Add three "Copy artifacts from upstream project" build steps to copy artifacts from the last three builds of the upstream project (use the environment variables from the script above to set the build numbers):
Run the build and check the build log; you should see something like this:
Copied 2 artifacts from "A" build number 4
Copied 2 artifacts from "A" build number 3
Copied 1 artifact from "A" build number 2
Note: the script may need to be adjusted to catch unusual cases like "the upstream project has only two builds", "the current job doesn't have an upstream job", "the current job has more than one upstream job", etc.
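If the consuming job is a pipeline rather than a freestyle job, the same idea can be sketched with the copyArtifacts step of the Copy Artifact Plugin; the project name 'A' and the variable names follow the example above:

// Copy the artifacts of three specific upstream builds, one folder each.
// specific() selects a build by its (string) build number.
[env.UP_BUILD_NUMBER1, env.UP_BUILD_NUMBER2, env.UP_BUILD_NUMBER3].each { bn ->
    copyArtifacts projectName: 'A',
                  selector: specific(bn),
                  target: "artifacts/${bn}"
}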
You can use the following example from an "Execute Shell" build step.
Please note it can be run only from the Jenkins master machine, and only if the job calling this step is the one that triggered the MultiJob.
#--------------------------------------
# Copy Artifacts from MultiJob Project
#--------------------------------------
PROJECT_NAME="MY_MULTI_JOB"
ARTIFACT_PATH="archive/target"
TARGET_DIRECTORY="target"

mkdir -p $TARGET_DIRECTORY
runCount="TRIGGERED_BUILD_RUN_COUNT_${PROJECT_NAME}"
for ((i=1; i<=${!runCount}; i++))
do
    buildNumber="${PROJECT_NAME}_${i}_BUILD_NUMBER"
    cp $JENKINS_HOME/jobs/$PROJECT_NAME/builds/${!buildNumber}/$ARTIFACT_PATH/* $TARGET_DIRECTORY
done
#--------------------------------------

Executing a Gant script from a Grails project

I have written my own Gant script, which works fine from the command line. Now I need to run this script from a Grails project like this:
def outputMessage
try {
    GroovyScriptEngine engine = new GroovyScriptEngine("/www", this.getClass().getClassLoader());
    engine.run("scripts/MyOwnScript_.groovy", "param1 param2")
    outputMessage = "<br> OK: Script run successfully"
}
catch (Exception e) {
    outputMessage += "<br> ERROR: There has been an error running the script"
}
The error I am getting is "No such property: includeTargets for class: MyOwnScript_", as my Gant script requires some other scripts.
Does anybody know a proper way to get this working?
Have you tried getting the path to your script folder and executing an external process, like
["groovy", "scripts/MyOwnScript_.groovy", "param1", "param2"].execute()
See here for more info about running external processes in Groovy.
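If you go this route, you will probably also want to wait for the process and capture its output; a minimal sketch:

// Run the script as an external process and collect stdout/stderr
// without risking a pipe-buffer deadlock on larger outputs.
def proc = ["groovy", "scripts/MyOwnScript_.groovy", "param1", "param2"].execute()
def out = new StringBuilder(), err = new StringBuilder()
proc.consumeProcessOutput(out, err)    // drain both streams in the background
proc.waitFor()
println "exit=${proc.exitValue()} out=$out err=$err"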
Answering my own question. The main problem was that I needed to run Grails using the full path, like this:
Map<String, String> env = System.getenv()
final processBuilder = new ProcessBuilder()
processBuilder.directory(new File("folderFromWhereIWantToRunTheGantScript"))
processBuilder.command([env['GRAILS_HOME'] + "/bin/grails", "MyOwnScript param1 param2"])
println processBuilder.directory()

Process proc = processBuilder.start()
def out = new StringBuilder(), err = new StringBuilder()  // buffers were missing in the original snippet
proc.consumeProcessOutput(out, err)
proc.waitFor()

How to write a connection script between Grails and Hadoop?

I need to copy files that are generated within Grails to Hadoop dynamically. How do I write code for this in Grails? Whenever a file is generated, it should be copied into Hadoop; if the incoming file already exists there, it should be updated.
I used a shell script to connect Grails and Hadoop.
I put all the commands to run the Hadoop jobs in myjob.sh (a workflow script),
and I added the code below to execute the shell script in my controller:
def scriptCom = "/folderlocation/shellscript.sh"
println "[[Running $scriptCom]]"
def proc = scriptCom.execute()
def oneMinute = 60000
proc.waitForOrKill(oneMinute)
if (proc.exitValue() != 0) {
    println "[[return code: ${proc.exitValue()}]]"
    println "[[stderr: ${proc.err.text}]]"
    return null
} else {
    def output = proc.in.text.readLines()  // read stdout once, then log and return it
    println "[[stdout: $output]]"
    return output
}
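The answer doesn't show the shell script itself, so here is a hypothetical sketch of the kind of command it might wrap; hdfs dfs -put -f overwrites the target if it already exists, which covers the "update if the file exists" requirement (all paths are made up):

// Push a generated file to HDFS, overwriting any existing copy (-f).
// localFile and hdfsDir are illustrative placeholders.
def localFile = "/var/grails/generated/report.xml"
def hdfsDir   = "/user/grails/incoming"
def proc = ["hdfs", "dfs", "-put", "-f", localFile, hdfsDir].execute()
proc.waitFor()
println proc.exitValue() == 0 ? "[[copied to HDFS]]" : "[[error: ${proc.err.text}]]"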
