I am trying to write a Groovy script that copies a complete folder, with all subfolders and jobs, into the folder where the script is executed.
This is what my folder structure looks like:
--> Templ
    |--> Folder
    |--> Folder
    |    |--> Subfolder
    |    |    |--> Subsubfolder
    |    |--> Subfolder
    |--> Folder
--> Execution 2020
    |--> Copyscript
I tried different plug-ins like Jobcopy Builder.
Finally I tried Groovy scripts, but nothing seems to work.
The simplest way is to use AntBuilder:
def ant = new AntBuilder()
ant.copy(todir: myDir) {
    fileset(dir: "src/test") {
        include(name: "**/*.java")
    }
}
Example taken from here:
http://docs.groovy-lang.org/latest/html/documentation/ant-builder.html
To see all parameters of the Ant copy task, see the documentation:
https://ant.apache.org/manual/Tasks/copy.html
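Adapted to the folder structure above, a minimal sketch might look like this. Note that the source and target paths are assumptions; they need to point at the real locations of the Templ folder and the current job folder on disk:

// Minimal sketch (assumed paths): copy everything under Templ into the
// directory the script is executed from.
def ant = new AntBuilder()
def targetDir = new File('.').canonicalPath   // folder where the script runs
ant.copy(todir: targetDir) {
    fileset(dir: '../Templ') {                // assumed relative path to Templ
        include(name: '**/*')
    }
}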
I am running a Jenkinsfile that needs to include a step which goes into an existing file in a GitHub repo and updates a line with a version number variable, stored as a Jenkins credential called ${version}, so that when the build is done the version is included in the compiled Go program.
e.g.
The Jenkins step needs to go into the version.go file, find the fmt.Println() line and add in ${version}.
stage('Update Version number in version.go') {
    sh 'echo "${version}" > Version.go'
}
File and line to be updated. Example of what I need the println line to look like.
Version.go
var versionCmd = &cobra.Command{
    Use:   "version",
    Short: "Print the version number of CLI",
    Run: func(cmd *cobra.Command, args []string) {
        fmt.Println( ${version} )
    },
}
I've been able to test adding ${version} to an existing file. My main confusion is how to specify a location in an existing file.
Thanks
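One possible approach, sketched here as an assumption rather than a confirmed answer, is to rewrite the matching line in place with sed from an sh step, letting the shell expand ${version}:

// Sketch only: replace the fmt.Println(...) line in Version.go in place.
// Assumes the line can be matched with this regular expression and that
// ${version} is available as an environment variable inside the sh step.
stage('Update Version number in version.go') {
    sh '''
        sed -i 's|fmt.Println(.*)|fmt.Println("'"${version}"'")|' Version.go
    '''
}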
I am preparing a Jenkins pipeline script in Groovy. I would like to move all files and folders to another location. Since Groovy supports Java, I used the Java code below to perform the operation.
pipeline {
    agent any
    stages {
        stage('Organise Files') {
            steps {
                script {
                    File sourceFolder = new File("C:\\My-Source");
                    File destinationFolder = new File("C:\\My-Destination");
                    File[] listOfFiles = sourceFolder.listFiles();
                    echo "Files Total: " + listOfFiles.length;
                    for (File file : listOfFiles) {
                        if (file.isFile()) {
                            echo file.getName()
                            Files.copy(Paths.get(file.path), Paths.get("C:\\My-Destination"));
                        }
                    }
                }
            }
        }
    }
}
This code throws the below exception:
groovy.lang.MissingPropertyException: No such property: Files for
class: WorkflowScript
I tried with the below code too, but it's not working either.
FileUtils.copyFile(file.path, "C:\\My-Destination");
Finally, I tried with Java I/O streams to perform the operation; the code is below:
def srcStream = new File("C:\\My-Source\\**\\*").newDataInputStream()
def dstStream = new File("C:\\My-Destination").newDataOutputStream()
dstStream << srcStream
srcStream.close()
dstStream.close()
But it's not working either and throws the below exception:
java.io.FileNotFoundException: C:\My-Source (Access is denied)
Can anyone suggest how to solve the problem, and please also let me know how I can delete the files from the source location after copying or moving them? One more thing: during the copy, can I filter some folders and files using a wildcard? Please let me know that as well.
Don't execute these I/O functions in plain Java/Groovy. Even if you get this running, it will always be executed on the master and not on the build agents. Use pipeline steps for this as well, for example:
bat("xcopy C:\\My-Source C:\\My-Destination /O /X /E /H /K")
or using the File Operations Plugin
fileOperations([fileCopyOperation(
    excludes: '',
    flattenFiles: false,
    includes: 'C:\\My-Source\\**',
    targetLocation: "C:\\My-Destination"
)])
I assume I didn't get the Windows path syntax exactly right in these examples, but I hope you get the point.
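For the other parts of the question, deleting the source files after the copy and filtering with wildcards, the same plugin should cover both; a minimal sketch, assuming its fileDeleteOperation step and Ant-style include/exclude patterns:

// Sketch only: copy, then delete the source files (i.e. a "move").
// The include/exclude patterns are also where wildcard filtering goes.
fileOperations([
    fileCopyOperation(
        includes: 'C:\\My-Source\\**',
        excludes: '',
        flattenFiles: false,
        targetLocation: 'C:\\My-Destination'
    ),
    fileDeleteOperation(
        includes: 'C:\\My-Source\\**',
        excludes: ''
    )
])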
I have a multi-project Gradle build working locally.
There is a parent gradle file, build.gradle
The settings.gradle file assigns the projects to their respective gradle build files:
include 'load'
include 'loadRemote'
project(':loadRemote').buildFileName = 'remoteLoad_build.gradle'
project(':load').buildFileName = 'load_build.gradle'
rootProject.name = 'EquipLoad'
The build.gradle parent file calls a buildAll command to build the 2 projects from the command line locally.
I created a Jenkinsfile to build both projects, but the Jenkins pipeline does not recognize the project-specific tasks.
These are the tasks behind the buildAll command:
gradle.projectsEvaluated {
    task compileAll(dependsOn: [project(':loadRemote').remoteLoadCleanCompileStage]) {
        compileAll.finalizedBy project(':load').loadCleanCompileStage
    }
    task packageAll(dependsOn: [project(':loadRemote').remoteLoadPackage]) {
        packageAll.finalizedBy project(':load').loadPackage
    }
    task buildAll(dependsOn: [compileAll]) {
        buildAll.finalizedBy packageAll
    }
}
The error in Jenkins is that it does not recognize the task project(':loadRemote').remoteLoadCleanCompileStage
How can I identify a multi-build project in Jenkins?
Do I have to add the settings.gradle file?
UPDATE
I thought that the different build files could not be located in the project, so I added this to the settings.gradle file:
rootProject.name = 'EquipLoad'
include 'load'
project(':load').projectDir = new File(settingsDir, rootProject.rootDir.getAbsolutePath() + "/Load")
project(':load').buildFileName = 'load_build.gradle'
include 'loadRemote'
project(':loadRemote').projectDir = new File(settingsDir, rootProject.rootDir.getAbsolutePath() + "/LoadRemote")
project(':loadRemote').buildFileName = 'remoteLoad_build.gradle'
The error is still the same: the parent build.gradle file does not recognize the dependency task project(':loadRemote').remoteLoadCleanCompileStage.
Looking at the debug statements, the child gradle build files are found and identified:
Evaluating project ':loadRemote' using build file '/var/.../loadRemote/remoteLoad_build.gradle'.
The same text is shown for the load build file.
Yet the tasks within these gradle build files are not recognized in the parent build.gradle file.
The problem was a simple case-sensitivity mistake.
I named the folders Load and LoadRemote, but identified them in the gradle scripts as ':load' and ':loadRemote'. Changing the script text to ':Load' and ':LoadRemote' fixed my problem.
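For reference, a settings.gradle matching the folder case could then look roughly like this (a sketch based on the update above; it assumes the parent build.gradle's task references are changed to ':Load' and ':LoadRemote' as well):

// Sketch: project names matching the on-disk folder case.
rootProject.name = 'EquipLoad'

include 'Load'
project(':Load').projectDir = new File(settingsDir, 'Load')
project(':Load').buildFileName = 'load_build.gradle'

include 'LoadRemote'
project(':LoadRemote').projectDir = new File(settingsDir, 'LoadRemote')
project(':LoadRemote').buildFileName = 'remoteLoad_build.gradle'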
We have just started using Gradle and have a few (noob) questions - hopefully someone can shed some light on these issues :)
We're using Angular and Grails to build our web app. We want to be as modular as possible and hence put all the Angular-related artifacts (mainly *.js and *.html files) into a separate project in our Gradle multi-project build.
Our project structure is as follows:
- root
-- build.gradle
-- settings.gradle
|-- web-grails (grails project)
|----- build.gradle
|-- web-js-html (angular / js / html sources)
|----- build.gradle
As a start, we simply want to package the web-js-html project accordingly. What we've come up with so far (other suggestions very welcome) is to apply a webjars structure to it, i.e. have a .jar file with the required resources under /META-INF/resources. Online, we found the following config that seems to work just fine:
// file :web-js-html/build.gradle
apply plugin: 'java'

ext {
    webjarconfig = [
        staticHTMLFilesDir: "${projectDir}/src/main/webfrontend",
        baseDir: "META-INF/resources/",
        subDir: "webjars/" + deployed_app_name
    ]
}

configurations {
    webjar
}

task webjar(type: Jar, dependsOn: 'jar') {
    from(fileTree(webjarconfig.staticHTMLFilesDir)) {
        into webjarconfig.baseDir + webjarconfig.subDir
    }
    outputs.file archivePath
}

artifacts {
    webjar(webjar.archivePath) {
        type 'jar'
        builtBy webjar
    }
}
By invoking 'gradle webjar', the jar gets created with the files in the correct place.
Question 1:
What I would have expected is that this jar also gets properly created if I invoke 'gradle build'. As far as I understand, 'gradle build' is a task defined by the java plugin which, at some point, invokes the 'jar' task. Once that 'jar' task is done, I would expect the webjar task to be invoked. But it's not, so clearly I'm missing something. Does it follow that webjar only ever gets executed if explicitly invoked (either from command-line or from within the build.gradle file)?
Now, we would like the webjar to be included in the web-grails war-file. The config of :web-grails/build.gradle is as follows:
apply plugin: "grails"
repositories {
mavenLocal()
maven { url artifactory_url }
}
buildscript {
repositories {
mavenLocal()
maven { url artifactory_url }
}
dependencies {
classpath 'org.grails:grails-gradle-plugin:2.0.1-SNB1'
}
}
grails {
grailsVersion = '2.3.8'
groovyVersion = '2.3.0'
}
dependencies {
bootstrap 'org.grails.plugins:tomcat:7.0.50'
compile project(':web-js-html')
}
After trial and error and quite a bit of reading, I arrived at this (possibly wrong) conclusion: when I invoke 'gradle build' on :web-grails, then (I assume) :build will also be invoked on the referenced :web-js-html project. I say this because the jar gets re-created in the build/lib folder, but obviously not by the webjar task. Hence, the resulting jar contains only the MANIFEST.MF.
Question 2:
Am I using Gradle correctly in that case and only overlooking a little thing, or is this whole approach questionable? How can I get the :web-js-html jar into the war properly?
Thank you for your help in advance!
The part where you define the new artifact doesn't make sense to me. Change
artifacts {
    webjar(webjar.archivePath) {
        type 'jar'
        builtBy webjar
    }
}
to
artifacts {
    webjar webjar
}
Maybe you should rename either your configuration or your task. In any case, the first webjar is your configuration and the second one is your task which creates the new jar.
Note that this will create a new artifact, so you have to give it a different name with:
task webjar(type: Jar, dependsOn: 'jar') {
    baseName = 'newJar'
    from(fileTree(webjarconfig.staticHTMLFilesDir)) {
        into webjarconfig.baseDir + webjarconfig.subDir
    }
    outputs.file archivePath
}
But I think you don't want to create a second jar; you want to change the original one. In that case you don't have to write a new task, but can configure the default jar task like this:
jar {
    from(fileTree(webjarconfig.staticHTMLFilesDir)) {
        into webjarconfig.baseDir + webjarconfig.subDir
    }
    outputs.file archivePath
}
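As an aside on Question 1: if you do keep a separate webjar task, it only runs when invoked explicitly unless you wire it into the lifecycle yourself; a minimal sketch (my assumption, not part of the original answer):

// Sketch only: make 'gradle build' also run the custom webjar task.
build.dependsOn webjar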
I would like to execute a pre-build (Grails) script from Jenkins to replace a file in the plugins directory with a file from my SCM.
#!/bin/bash
PLUGINS_ORIG_DIR="plugins"
PLUGINS_DEST_DIR="/home/<my_user_name>/.grails/2.1.1/projects/judo/plugins"
cp -r $PLUGINS_ORIG_DIR/lang-selector-0.3/* $PLUGINS_DEST_DIR
But the script fails because $PLUGINS_DEST_DIR cannot be found. What should the path be, or what is the best way to accomplish this?
Thank you.
[EDIT]
I have also tried to create a pre-war event, but it does not work either:
/**
 * Copy modified resources to plugins directory, before packing the WAR
 */
eventCreateWarStart = { warName, stagingDir ->
    def buildSettings = BuildSettingsHolder.getSettings()
    def projectPluginsDir = buildSettings.getProperty("projectPluginsDir")
    def baseDir = buildSettings.getProperty("baseDir")
    ant.copy(todir: "${projectPluginsDir}/lang-selector-0.3", overwrite: true) {
        fileset(dir: "${basedir}/plugins/lang-selector-0.3", includes: "**")
    }
    ant.copy(todir: "${projectPluginsDir}/jquery-datatables-1.7.5", overwrite: true) {
        fileset(dir: "${basedir}/plugins/jquery-datatables-1.7.5", includes: "**")
    }
}
Did you set <my_user_name> to your CloudBees account name?
Then you're wrong; you should use /home/jenkins or just $HOME, as builds run on general-purpose slaves as the "jenkins" user.
I solved it by copying the files to ${stagingDir} instead of ${projectPluginsDir}.
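A minimal sketch of what that might look like, based on the event above (the plugin path is taken from the question; the exact target layout inside the staging directory is an assumption):

// Sketch only: copy the modified plugin resources into the WAR staging
// directory instead of the project plugins directory.
eventCreateWarStart = { warName, stagingDir ->
    ant.copy(todir: "${stagingDir}/WEB-INF/plugins/lang-selector-0.3", overwrite: true) {
        fileset(dir: "${basedir}/plugins/lang-selector-0.3", includes: "**")
    }
}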