I have a Jenkins job that invokes a Gradle script to create a .war file from sources.
The gradle war command produces a file named Geo-1.0.5.war because build.gradle uses a version number:
war {
    baseName = 'Geo'
    version = '1.0.5'
}
This file is then copied to and deployed on a WildFly server through SSH using the "Publish Over SSH Plugin".
How can I tell the plugin that the WAR filename has a format like Geo-$gradle_version.war?
This is documented if you click the (?) help icon next to the "Source files" field within Jenkins:
The string is a comma separated list of includes for an Ant fileset eg. **/*.jar
(see Patterns in the Ant manual).
So in your case, you could use **/Geo-*.war as the source pattern.
This is also shown in the screenshot on the plugin wiki page, and in the Source Files and Examples sections on the linked "Publish Over…" documentation.
In your comment on this answer, you mention that you don't just want to tell the plugin that the filename is "something like Geo-$gradle_version.war" for uploading, but rather want to use the exact filename in a script executed on the SSH host.
You could do this by adding an Execute Shell build step that determines the filename and exports it as an environment variable using the EnvInject Plugin. For example:
f=$(basename `find . -name 'Geo-*.war'`)
echo WAR_FILENAME=${f} > env.properties
Then, by using an Inject Environment Variables step with its path set to env.properties, the WAR_FILENAME value will be added to the build environment, available for use by subsequent steps.
In the Exec Command field of the SSH-publishing step, you can then use ${WAR_FILENAME} to refer to the exact filename uploaded.
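For example, a sketch of an Exec Command that drops the uploaded file into WildFly's deployment directory (the upload path and WildFly location below are placeholders, not values from the question):

# WAR_FILENAME was injected into the build environment by the steps above
cp /home/deploy/upload/${WAR_FILENAME} /opt/wildfly/standalone/deployments/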
Related
I want to deploy to WebLogic using Groovy code inside a Jenkins pipeline job.
Has anyone ever used Groovy code inside a Jenkins pipeline job to deploy to a WebLogic application? The WebLogic version is 10.x.
I know how to do it with a freestyle job, where it works via a plugin, but when I click on Pipeline Syntax I don't see anything from this plugin.
I have googled and googled, and nothing actually works, or it is outside the scope of my needs, or too complex for me to understand and adapt (using Java etc.).
SOLVED: OK, so I found a way to make it work. Basically, one can write Python (Jython) code that manages WebLogic through its built-in WLST scripting mechanism.
But to make everything work, one needs to:
generate wlfullclient.jar on your WebLogic machine: https://docs.oracle.com/cd/E12839_01/web.1111/e13717/jarbuilder.htm#SACLT239
Use the following steps to create a wlfullclient.jar file for a JDK 1.6 client application:
Change directories to the server/lib directory.
cd WL_HOME/server/lib
Use the following command to create wlfullclient.jar in the server/lib directory:
java -jar wljarbuilder.jar
You can now copy and bundle the wlfullclient.jar with client applications.
Add the wlfullclient.jar to the client application's classpath.
In order for this to work from another machine, without installing WebLogic on it, one needs additional .jar files, which can be found on the WebLogic machine in a WebLogic folder, e.g. C:\bea10\wlserver_10.3....
Copy the dependent .jar files to the desired machine, create an empty props.txt file, and call your Python script like this (in the command you will see which .jar files are also needed on the classpath via -cp). -Dweblogic.home is where weblogic.jar is located. Note that if you are going to put those jars in the CLASSPATH environment variable, you can NOT add a path to a folder, since .jar and .zip files need to be targeted directly.
java -cp C:\Users\icami\Desktop\weblogic\wlfullclient.jar;C:\Users\icami\Desktop\weblogic\com.bea.core.xml.xmlbeans_2.2.0.0.jar;C:\Users\icami\Desktop\weblogic\com.oracle.cie.comdev_6.4.0.0.jar;C:\Users\icami\Desktop\weblogic\com.oracle.cie.config-wls-schema_10.3.6.0.jar;C:\Users\icami\Desktop\weblogic\com.oracle.cie.config-wls_7.2.0.0.jar;C:\Users\icami\Desktop\weblogic\com.oracle.cie.config_7.2.0.0.jar;C:\Users\icami\Desktop\weblogic\com.oracle.cie.wizard_6.1.0.0.jar;C:\Users\icami\Desktop\weblogic\com.oracle.core.weblogic.msgcat_1.2.0.0.jar;C:\Users\icami\Desktop\weblogic\jython.jar;C:\Users\icami\Desktop\weblogic\weblogic.jar -Dprod.props.file=C:\Users\icami\Desktop\weblogic\props.txt -Dbea.home= -Dweblogic.home=C:\Users\icami\Desktop\weblogic weblogic.WLST test.py
Here is an example of a safe test.py; it only retrieves state, listen address, etc. and doesn't change anything, so feel free to run it:
username = 'weblogic'
password = 'weblogic'
URL='t3://weblogic.domain.com:7001'
connect(username,password,URL)
domainRuntime()
cd('ServerRuntimes')
servers=domainRuntimeService.getServerRuntimes()
for server in servers:
    serverName=server.getName();
    print '**************************************************\n'
    print '############## ', serverName, '###############'
    print '**************************************************\n'
    print '##### Server State #####', server.getState()
    print '##### Server ListenAddress #####', server.getListenAddress()
    print '##### Server ListenPort #####', server.getListenPort()
    print '##### Server Health State #####', server.getHealthState()
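If you drive this from Jenkins, the long java command above can simply go into an Execute shell (or Windows batch) build step of the job. A minimal sketch for a Linux agent, assuming the client jars, props.txt and test.py from above have been copied to /opt/wlclient (the directory is an assumption; adjust to your layout):

# "$WL_LIBS/*" puts every .jar in the directory on the classpath
# (a bare folder path would only pick up .class files, as noted above)
WL_LIBS=/opt/wlclient
java -cp "$WL_LIBS/*" \
  -Dprod.props.file="$WL_LIBS/props.txt" \
  -Dbea.home= \
  -Dweblogic.home="$WL_LIBS" \
  weblogic.WLST "$WL_LIBS/test.py"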
I have 3 Jenkins jobs to be run in series.
Run an Ant file
Run another Ant file
Run a command line
All the above jobs use a file path which is set in a properties file.
E.g. Job 1 executes an Ant file placed in the file path location,
Job 2 executes another file placed in the same file path location,
Job 3 executes a command line to do an SVN update in the same file path location.
I need to parameterize the file path in all three builds from the properties file.
Can anyone help me with a possible approach?
Thanks In Advance
This answer may be a little high level: you can use Jenkins Pipeline as code for this instead of 3 freestyle jobs.
You can create 3 stages which perform these 3 steps. Pipeline as code supports reading properties from different file types (JSON, YAML, etc.), for example via the readProperties step from the Pipeline Utility Steps plugin.
Look for the "EnvInject" plugin. This lets you inject properties into your build as environment variables; these assignments survive build step boundaries.
If the property file is checked in, you can load it in the Build Environment section before the build steps start executing. If the property file is generated during the build sequence, you can add a build step between where the property file is created and where it is used.
Once set, if the property file contains "FOO=/path/to/folder", then when configuring things in Jenkins you would refer to $FOO or ${FOO} (for example, an Ant build step might specify "${FOO}/build.xml"). In Windows batch script execution, FOO shows up as an environment variable and is referenced as %FOO% (e.g., "echo Some_Useful_Piece_Of_Data > %FOO%\data.txt").
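For instance, if a checked-in file filepath.properties (the name is just an example) contains FOO=/path/to/folder and is loaded in the Build Environment section, an Execute shell build step, such as the SVN update, can simply use the injected variable:

# FOO was injected by EnvInject from filepath.properties before this step runs
svn update "${FOO}"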
More information can be found here: https://wiki.jenkins.io/display/JENKINS/EnvInject+Plugin
I know it's possible to run a .dsl file from an external source instead of just writing the flow code in the job's description, but every time I try to run, let's say:
/home/flows/flow_script.dsl
I get the following error:
java.io.FileNotFoundException: /home/flows/flow_script.dsl (No such file or directory)
The path is correct, I can see the file at that path from the shell, but apparently it doesn't let me select anything outside the build's workspace.
I recently ran into this very issue: my DSL script was outside of my workspace (installed via a package). The problem is that the DSL Scripts path is an Ant format that only allows specific patterns (and not absolute paths).
My workaround is hacky, but it did work: add an Execute Shell step before the "Process Job DSLs" step that symlinks the external directory into the workspace.
Something like this:
echo "Creating a symlink from /home/flows to workspace"
ln -sf "/home/flows" .flows
Then you can set the DSL Scripts path to ".flows/flow_script.dsl".
This has some additional caveats, of course: the directory you're symlinking from will need to be accessible by the jenkins user. And it likely violates a lot of best practices.
I have configured a job in Jenkins and checked the "This build is parameterized" option. The parameter name I have given is "My_Param". Jenkins is installed on the server machine, so I access the Jenkins dashboard through http://<servername>:8080/.
In the Build part, I have to call a script by opening cygwin, so I write:
#!C:\cygwin\bin\bash --login -i
./build/myscript.sh -full
After the build is completed, I want to move the files to a new directory prefixed with Output; the rest of the directory name is the parameter I intend to pass. So I write:
mkdir /cygdrive/c/users/admin/Output$My_Param
I run the build and pass first as the parameter value.
But the directory is created as Output on the server machine, and not as Outputfirst.
Since you noted that you use cygwin, I understand the server is on Windows.
Try referencing the parameter as a Windows-style environment variable, %My_Param%, or Linux-style, ${My_Param}.
I hope this helps.
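For example, with the Linux-style form the cygwin script from the question would look something like this (same paths as in the question; just a sketch):

#!C:\cygwin\bin\bash --login -i
./build/myscript.sh -full
# My_Param is the Jenkins build parameter; the braces make the variable boundary explicit
mkdir /cygdrive/c/users/admin/Output"${My_Param}"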
I have set up a jenkins build and everything is working fine except the very last step.
The whole build creates a directory called: build
This directory contains a web-inf folder, and I would like to publish all the files in it via SCP to a different location, so that all the content of the build/web-inf folder becomes the content of the target folder.
The settings for the Jenkins SCP plugin are (it is a post-build step):
source: build/web-inf/**
destination: public_html/
that results in:
public_html/build/web-inf/...
but should be:
public_html/...
(the keep hierarchy box is ticked)
How can I make that happen?
EDIT
I was able to solve the problem without any additional script. The solution is so simple that my question turned out to be stupid.
All I did was tell Ant to copy all the web files to ./public_html instead of ./build/web-inf/, which made the Jenkins SCP plugin copy all files from public_html to public_html, exactly as intended.
If your goal is just to SCP files generated during the build, and the plugin doesn't seem to be working (I couldn't see anything wrong in your configuration), you can use an "Execute shell" build step and type the scp command yourself, something like this (try it in a shell first, in your job's build directory, to get the syntax right):
scp -r build/web-inf/* user@host:/destination-directory