How to create a Jenkins job with parameters using Ansible?

I need help creating a Jenkins job with parameters using Ansible. I have checked many documents, but they have not helped much.
I have checked https://docs.ansible.com/ansible/latest/collections/community/general/jenkins_job_module.html, but I don't understand what needs to go in templates/test.xml:
- name: Create a jenkins job using basic authentication
  community.general.jenkins_job:
    config: "{{ lookup('file', 'templates/test.xml') }}"
    name: test
    password: admin
    url: http://localhost:8080
    user: admin

Here is an example of a simple Jenkins job configuration in XML format that you can use as a starting point:
<?xml version='1.0' encoding='UTF-8'?>
<project>
  <description>Example Jenkins job with parameters</description>
  <keepDependencies>false</keepDependencies>
  <properties>
    <hudson.model.ParametersDefinitionProperty>
      <parameterDefinitions>
        <hudson.model.StringParameterDefinition>
          <name>param1</name>
          <description>Example parameter</description>
          <defaultValue>value1</defaultValue>
        </hudson.model.StringParameterDefinition>
      </parameterDefinitions>
    </hudson.model.ParametersDefinitionProperty>
  </properties>
  <scm class="hudson.scm.NullSCM"/>
  <canRoam>true</canRoam>
  <disabled>false</disabled>
  <blockBuildWhenDownstreamBuilding>false</blockBuildWhenDownstreamBuilding>
  <blockBuildWhenUpstreamBuilding>false</blockBuildWhenUpstreamBuilding>
  <triggers/>
  <concurrentBuild>false</concurrentBuild>
  <builders>
    <hudson.tasks.Shell>
      <command>echo "Hello, world!"</command>
    </hudson.tasks.Shell>
  </builders>
  <publishers/>
  <buildWrappers/>
</project>
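Once the job exists, you can also trigger it from the same playbook. The following is only a sketch: it assumes the community.general.jenkins_build module is available in your installed collection version, and that its args parameter accepts a dict of build parameters (check the module docs for your version); param1 matches the parameter defined in test.xml above.

```yaml
# Hypothetical follow-up task: trigger the parameterized job just created
- name: Trigger the parameterized job
  community.general.jenkins_build:
    name: test
    args:
      param1: value1      # must match a parameter defined in the job config
    user: admin
    password: admin
    url: http://localhost:8080
```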


Jib - where to copy webapp folder inside image?

I am creating a Docker image using Google's Jib Maven plugin. The image gets created successfully and the backend services work fine, but my webapp folder is not part of the image. Before Jib, I was creating a zip containing everything (including the webapp folder in the root of the zip, along with the executable jar), which worked fine.
Now the image created by Jib has classes, libs and resources in the app root. How and where should I copy the webapp folder?
It worked for me by using the extra directories support (extraDirectories) provided by the Maven Jib plugin.
<extraDirectories>
  <paths>
    <path>webapp</path>
    <path>
      <from>webapp</from>
      <into>/app/webapp</into>
    </path>
  </paths>
</extraDirectories>
I am currently running Spring Boot version 2.4.10 and the application is packaged as a jar.
My project has JSPs at:
src/main/webapp/WEB-INF/jsp
This is important because it allows me to run the application as an executable jar: java -jar ./target/app.jar --spring.profiles.active=prod
The Jib plugin doesn't copy the src/main/webapp directory to the container image by default, so we need to add it manually by including the following configuration.
<extraDirectories>
  <paths>
    <path>
      <from>src/main/webapp/WEB-INF</from>
      <into>/app/resources/META-INF/resources/WEB-INF</into>
    </path>
  </paths>
</extraDirectories>
I provide the Jib plugin a custom entrypoint.sh.
The entrypoint.sh is located at src/main/jib:
#!/bin/sh
echo "The application will start in ${APPLICATION_SLEEP}s..." && sleep ${APPLICATION_SLEEP}
exec java ${JAVA_OPTS} -noverify -XX:+AlwaysPreTouch \
  -Djava.security.egd=file:/dev/./urandom -cp /app/resources/:/app/classes/:/app/libs/* \
  "com.demo.application.Application" "$@"
My final jib-plugin configuration is the following:
<plugin>
  <groupId>com.google.cloud.tools</groupId>
  <artifactId>jib-maven-plugin</artifactId>
  <version>${jib-maven-plugin.version}</version>
  <configuration>
    <from>
      <image>adoptopenjdk:11-jre-hotspot</image>
    </from>
    <to>
      <image>myprivateregistry/app/${project.name}</image>
      <tags>
        <tag>latest</tag>
        <tag>${project.version}</tag>
      </tags>
    </to>
    <container>
      <entrypoint>
        <shell>bash</shell>
        <option>-c</option>
        <arg>/entrypoint.sh</arg>
      </entrypoint>
      <ports>
        <port>8080</port>
      </ports>
      <environment>
        <SPRING_OUTPUT_ANSI_ENABLED>ALWAYS</SPRING_OUTPUT_ANSI_ENABLED>
        <APPLICATION_SLEEP>0</APPLICATION_SLEEP>
      </environment>
      <creationTime>USE_CURRENT_TIMESTAMP</creationTime>
    </container>
    <extraDirectories>
      <paths>
        <path>src/main/jib</path>
        <path>
          <from>src/main/webapp/WEB-INF</from>
          <into>/app/resources/META-INF/resources/WEB-INF</into>
        </path>
      </paths>
      <permissions>
        <permission>
          <file>/entrypoint.sh</file>
          <mode>755</mode>
        </permission>
      </permissions>
    </extraDirectories>
  </configuration>
  <!-- Make the Jib plugin run during the package lifecycle -->
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>build</goal>
      </goals>
    </execution>
  </executions>
</plugin>
The above didn't work for me but the below did.
<extraDirectories>
  <paths>
    <path>
      <from>../path/to/frontend/app/dist</from>
      <into>/app/resources/static</into>
    </path>
  </paths>
</extraDirectories>

Use Grab to Download Ivy Dependency Jar when Artifact name is not the same as Module name

Within a Jenkins Groovy script I'm trying to download a dependency using the following:
@Grab(group='myorg', module='SuiteCreator', version='1.16.1', conf='jar', transitive=false)
import myorg.myorgAPI
I have a /home/jenkins/.groovy/grapeConfig.xml file with the following:
<?xml version="1.0" encoding="UTF-8"?>
<ivy-settings>
  <settings defaultResolver="downloadGrapes"/>
  <resolvers>
    <chain name="downloadGrapes">
      <sftp user="admin" userPassword="pw" host="ivy.myorg.com" name="myrepository" checkmodified="true">
        <ivy pattern="/data/ivy/repo/[organisation]/[module]/[branch]/[revision]/ivy-[revision].xml"/>
        <artifact pattern="/data/ivy/repo/[organisation]/[module]/[branch]/[revision]/[artifact]-[revision].[ext]"/>
      </sftp>
    </chain>
  </resolvers>
</ivy-settings>
The ivy-1.16.1.xml of the module I'm trying to grab:
<?xml version="1.0" encoding="UTF-8"?>
<ivy-module version="1.0">
  <info organisation="myorg" module="SuiteCreator" branch="master" revision="1.16.1" status="release" publication="20190417105814"/>
  <configurations>
    <conf name="jar" description="Distribution jar"/>
  </configurations>
  <publications>
    <artifact name="myorg-suitecreator" type="jar" ext="jar" conf="jar"/>
  </publications>
</ivy-module>
So I'm just trying to grab the artifact: myorg-suitecreator-1.16.1.jar.
When I run my groovy script in Jenkins I get the following error:
2019.07.09 18:06:15 org.codehaus.groovy.control.MultipleCompilationErrorsException:
startup failed:
2019.07.09 18:06:15 General error during conversion: Error grabbing Grapes -- [download failed:
myorg#SuiteCreator#master;1.16.1!SuiteCreator.jar]
2019.07.09 18:06:15
2019.07.09 18:06:15 java.lang.RuntimeException: Error grabbing Grapes -- [download failed: myorg#SuiteCreator#master;1.16.1!SuiteCreator.jar]
2019.07.09 18:06:15 at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
From the error it seems Grape is assuming the Ivy artifact name is the same as the module name. The artifact name in ivy-1.16.1.xml, at /ivy-module/publications/artifact/@name, is defined as myorg-suitecreator, but Grab appears to be attempting to download SuiteCreator.jar.
The artifact pattern in grapeConfig.xml is:
<artifact pattern="/data/ivy/repo/[organisation]/[module]/[branch]/[revision]/[artifact]-[revision].[ext]"/>
And so the file I'm trying to grab is actually: /data/ivy/repo/myorg/SuiteCreator/master/1.16.1/myorg-suitecreator-1.16.1.jar
Does anyone have any suggestions on how to get this to work (or whether Grab can download artifacts from Ivy whose artifact name differs from the module name)?
I gave up trying to use Grab to achieve this. I found another limitation of Grab: it doesn't allow you to specify the branch of the artifact you wish to retrieve. I realise that not having releases on a master branch or a single release branch may not be best practice, but we do have this requirement in our dev environment.
Instead I simply used an Invoke Ant build step within Jenkins to retrieve my Ivy artifact. We already use Ant in our dev process, so this was not difficult.
The Ant build.xml script is located in the same Git repository as the Groovy script I wish to run. The retrieve-suite-creator target is simply an ivy:retrieve:
<target name="retrieve-suite-creator" depends="clean, install-ivy">
  <ivy:retrieve conf="suite-creator" type="jar" pattern="${build.dir}/[artifact].[ext]" log="${ivy.resolve.log}" settingsRef="${ivy.build.settings}"/>
</target>
Using my ivy.xml (again in the same repo as the groovy script):
<ivy-module version="1.0">
  <info organisation="myorg" module="MyAutomation" status="integration" branch="${ivy.branch}"/>
  <configurations>
    <conf name="suite-creator" description="Configuration for Suite Creator"/>
  </configurations>
  <dependencies>
    <dependency org="myorg" name="SuiteCreator" branch="mybranch" rev="1.16.1" conf="suite-creator->suite-creator" changing="true"/>
  </dependencies>
</ivy-module>
I had to add the suite-creator ivy configuration to the SuiteCreator module's ivy.xml (in a separate SuiteCreator Git repo). I couldn't use the existing jar configuration as this also downloaded all the transitive dependencies which I didn't need.
<ivy-module version="1.0">
  <info organisation="myorg" module="SuiteCreator" status="integration" branch="${ivy.branch}"/>
  <configurations>
    <!-- Build configurations -->
    <conf name="build" description="Classes used in jar compilation"/>
    <conf name="jar" description="Distribution jar"/>
    <conf name="suite-creator" description="Just the myorg-suitecreator.jar"/>
  </configurations>
  <publications>
    <artifact name="myorg-suitecreator" type="jar" ext="jar" conf="jar,suite-creator"/>
  </publications>
  <dependencies>
    ...
  </dependencies>
</ivy-module>
Finally, in my Jenkins job, after the Invoke Ant build step I had an Execute Groovy Script build step, where I had to add the downloaded jar to my classpath.

Jenkins SBT plugin: How to have no tests being run treated as success?

I am building an SBT project on Jenkins with the SBT plugin, which works fine so far. The goals I execute are compile testQuick. This causes Jenkins to only run tests affecting things that changed since the last Git Push.
However, when I only reorganize things in my build files or bump a version number, this causes no tests to be run, and the post-build action "Publish JUnit test results" to fail.
Is there a way to treat no tests being run as a success instead?
Not really a solution, but a makeshift workaround until anyone comes up with a better idea:
I created a "dummy test result file" dummyresult.xml which looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<testsuites>
  <testsuite name="DummyResultSoBuildDoesNotFail" errors="0" tests="0" failures="0" time="0" timestamp="2013-05-24T10:23:58" />
  <testsuite name="DummyResultSoBuildDoesNotFail.constructor" errors="0" skipped="1" tests="1" failures="0" time="0.006" timestamp="2013-05-24T10:23:58">
    <properties>
      <property name="java.vendor" value="Sun Microsystems Inc." />
      <property name="compiler.debug" value="on" />
      <property name="project.jdk.classpath" value="jdk.classpath.1.6" />
    </properties>
    <testcase classname="DummyResultSoBuildDoesNotFail.constructor" name="Dummy result, needed so result collection does not fail" time="0">
      <skipped />
    </testcase>
  </testsuite>
</testsuites>
Then I added a post build step which is "Execute shell" and executes the following:
mkdir -p target/test-reports && cp ~/dummyresult.xml target/test-reports
Now, even when no tests are run, Jenkins picks up one skipped "Dummy test" and is happy.
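A variant of the same workaround is to generate the dummy file inline in the shell step instead of copying it from the Jenkins home directory, which removes the dependency on ~/dummyresult.xml existing on every build node. This is a sketch; the file name and report directory are assumptions and must match whatever glob the "Publish JUnit test results" action is configured with (e.g. target/test-reports/*.xml):

```shell
# Write a minimal dummy JUnit result inline so the report collector
# always finds at least one result file, even when no tests ran.
mkdir -p target/test-reports
cat > target/test-reports/dummyresult.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<testsuites>
  <testsuite name="DummyResultSoBuildDoesNotFail" errors="0" skipped="1" tests="1" failures="0" time="0">
    <testcase classname="DummyResultSoBuildDoesNotFail" name="dummy" time="0">
      <skipped/>
    </testcase>
  </testsuite>
</testsuites>
EOF
```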

Dynamically create the version number within the Ambari's metainfo.xml file using maven build processes

I don't want to hardcode my service version into metainfo.xml. Can I avoid that?
<service>
  <name>DUMMY_APP</name>
  <displayName>My Dummy APP</displayName>
  <comment>This is a distributed app.</comment>
  <version>0.1</version> <!-- This is what I don't want to hardcode -->
  <components>
    ...
  </components>
</service>
I am using maven as my build tool.
This can be done by using Maven's resource filtering. Three steps are required:
1. Define a Maven property that will hold the version number.
2. Reference that Maven property as a ${...} placeholder in the metainfo.xml file.
3. Indicate in the pom that the metainfo.xml file needs to be filtered.
For example, let's assume you want to use the project's Maven version identifier, ${project.version}, as the version in the metainfo.xml file. You would replace <version>0.1</version> with <version>${project.version}</version>. Then in your pom file you would need to list the metainfo.xml file as needing to be filtered. The procedure for this step will vary depending on how you are bundling the custom Ambari service (rpm, assembly, etc.). In general, whichever plugin you are using, when you list the sources (content to include in the bundle) you will need to specify the path to the metainfo.xml file and make sure filtering is set to true.
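For reference, the same filtering mechanism can be sketched with the standard Maven resources section, independent of any packaging plugin. This is a minimal sketch, assuming metainfo.xml sits directly under src/main/resources:

```xml
<build>
  <resources>
    <resource>
      <directory>src/main/resources</directory>
      <!-- Enable ${...} placeholder substitution for this file -->
      <filtering>true</filtering>
      <includes>
        <include>metainfo.xml</include>
      </includes>
    </resource>
  </resources>
</build>
```

After mvn process-resources, the copy of metainfo.xml under target/classes would contain the resolved version instead of the placeholder.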
Now let's assume you want to create an rpm that will install your artifact. It would look something like this:
Your project structure should be as follows:
--src
  --main
    --resources
      --configuration
      --scripts
      metainfo.xml
Your pom file would look like this:
<?xml version="1.0"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <version>0.0.1-SNAPSHOT</version>
  <artifactId>com.example.project</artifactId>
  <packaging>pom</packaging>
  <properties>
    <hdp.name>HDP</hdp.name>
    <hdp.version>2.3</hdp.version>
    <stack.dir.prefix>/var/lib/ambari-server/resources/stacks</stack.dir.prefix>
  </properties>
  <build>
    <plugins>
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>rpm-maven-plugin</artifactId>
        <version>2.1.2</version>
        <extensions>true</extensions>
        <executions>
          <execution>
            <id>generate-hdp-rpm</id>
            <phase>package</phase>
            <goals>
              <goal>attached-rpm</goal>
            </goals>
            <configuration>
              <classifier>hdp</classifier>
              <needarch>true</needarch>
              <sourceEncoding>UTF-8</sourceEncoding>
              <distribution>blah</distribution>
              <group>something/Services</group>
              <packager>company</packager>
              <vendor>company</vendor>
              <name>SERVICENAME-ambari-hdp</name>
              <defineStatements>
                <!-- I use the below line to prevent compiled python scripts from failing the build -->
                <defineStatement>_unpackaged_files_terminate_build 0</defineStatement>
                <defineStatement>platform_stack_directory ${stack.dir.prefix}/${hdp.name}/${hdp.version}</defineStatement>
              </defineStatements>
              <requires>
                <require>ambari-server</require>
              </requires>
              <mappings>
                <mapping>
                  <directory>${stack.dir.prefix}/${hdp.name}/${hdp.version}/services/SERVICENAME</directory>
                  <filemode>755</filemode>
                  <username>root</username>
                  <groupname>root</groupname>
                </mapping>
                <mapping>
                  <directory>${stack.dir.prefix}/${hdp.name}/${hdp.version}/services/SERVICENAME</directory>
                  <directoryIncluded>false</directoryIncluded>
                  <filemode>755</filemode>
                  <username>root</username>
                  <groupname>root</groupname>
                  <sources>
                    <source>
                      <location>src/main/resources/metainfo.xml</location>
                      <filter>true</filter>
                    </source>
                  </sources>
                </mapping>
                <mapping>
                  <directory>${stack.dir.prefix}/${hdp.name}/${hdp.version}/services/SERVICENAME/configuration</directory>
                  <filemode>755</filemode>
                  <username>root</username>
                  <groupname>root</groupname>
                </mapping>
                <mapping>
                  <directory>${stack.dir.prefix}/${hdp.name}/${hdp.version}/services/SERVICENAME/configuration</directory>
                  <directoryIncluded>false</directoryIncluded>
                  <filemode>755</filemode>
                  <username>root</username>
                  <groupname>root</groupname>
                  <sources>
                    <source>
                      <location>src/main/resources/configuration</location>
                    </source>
                  </sources>
                </mapping>
                <mapping>
                  <directory>${stack.dir.prefix}/${hdp.name}/${hdp.version}/services/SERVICENAME/package</directory>
                  <filemode>755</filemode>
                  <username>root</username>
                  <groupname>root</groupname>
                </mapping>
                <mapping>
                  <directory>${stack.dir.prefix}/${hdp.name}/${hdp.version}/services/SERVICENAME/package/scripts</directory>
                  <filemode>755</filemode>
                  <username>root</username>
                  <groupname>root</groupname>
                </mapping>
                <mapping>
                  <directory>${stack.dir.prefix}/${hdp.name}/${hdp.version}/services/SERVICENAME/package/scripts</directory>
                  <directoryIncluded>false</directoryIncluded>
                  <filemode>755</filemode>
                  <username>root</username>
                  <groupname>root</groupname>
                  <sources>
                    <source>
                      <location>src/main/resources/scripts</location>
                    </source>
                  </sources>
                </mapping>
              </mappings>
            </configuration>
          </execution>
          <!-- You may have multiple executions if you want to create rpms for stacks other than HDP -->
        </executions>
      </plugin>
    </plugins>
  </build>
  <dependencies>
    <!-- List any dependencies you need -->
  </dependencies>
</project>
This will create an rpm that when installed will add your service to the HDP 2.3 stack. After installing the rpm you would have to restart ambari-server to make sure the new service definition is picked up.
Update:
To create additional RPMs for other stacks, you will need to:
1. Duplicate the execution block in the rpm-maven-plugin section of the pom.
2. Change the id element of the new execution to be unique.
3. Modify the mappings to reflect the directory/file structure you want for the other stack.

Camunda BPM 7.2: register engine-plugin in processes.xml

According to the docs, you can register a plugin in processes.xml
(see http://docs.camunda.org/latest/guides/user-guide/#process-engine-process-engine-plugins-configuring-process-engine-plugins).
The XSD says that the element 'process-engine' is a complex type and has an attribute 'name' (see http://camunda.org/schema/1.0/ProcessApplication.xsd).
But when I deploy my process application with the following processes.xml to camunda-bpm-wildfly-7.2.0, I get this error:
19:16:08,547 ERROR [org.jboss.msc.service.fail] (MSC service thread 1-5) MSC000001: Failed to start service jboss.deployment.unit."firstTestProject-0.0.1-SNAPSHOT.war".POST_MODULE: org.jboss.msc.service.StartException in service jboss.deployment.unit."firstTestProject-0.0.1-SNAPSHOT.war".POST_MODULE: JBAS018733: Failed to process phase POST_MODULE of deployment "firstTestProject-0.0.1-SNAPSHOT.war"
at org.jboss.as.server.deployment.DeploymentUnitPhaseService.start(DeploymentUnitPhaseService.java:166) [wildfly-server-8.1.0.Final.jar:8.1.0.Final]
at org.jboss.msc.service.ServiceControllerImpl$StartTask.startService(ServiceControllerImpl.java:1948) [jboss-msc-1.2.2.Final.jar:1.2.2.Final]
at org.jboss.msc.service.ServiceControllerImpl$StartTask.run(ServiceControllerImpl.java:1881) [jboss-msc-1.2.2.Final.jar:1.2.2.Final]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [rt.jar:1.7.0_21]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [rt.jar:1.7.0_21]
at java.lang.Thread.run(Thread.java:722) [rt.jar:1.7.0_21]
Caused by: org.camunda.bpm.engine.ProcessEngineException: cvc-type.3.1.1: Element 'process-engine' is a simple type, so it cannot have attributes, excepting those whose namespace name is identical to 'http://www.w3.org/2001/XMLSchema-instance' and whose [local name] is one of 'type', 'nil', 'schemaLocation' or 'noNamespaceSchemaLocation'. However, the attribute, 'name' was found. | vfs:/content/firstTestProject-0.0.1-SNAPSHOT.war/WEB-INF/classes/META-INF/processes.xml | line 6 | column 34
My processes.xml:
<?xml version="1.0" encoding="UTF-8" ?>
<process-application
    xmlns="http://www.camunda.org/schema/1.0/ProcessApplication"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <process-archive name="myBpmApp">
    <process-engine name="default">
      <plugins>
        <plugin>
          <class>myrealm.bpm.camunda.engine.plugin.MyIdentityProviderPlugin</class>
        </plugin>
      </plugins>
    </process-engine>
    <properties>
      <property name="isDeleteUponUndeploy">false</property>
      <property name="isScanForProcessDefinitions">true</property>
    </properties>
  </process-archive>
</process-application>
What am I missing?
Thanks!
The process engine configuration must be outside of the process-archive declaration. Inside the process-archive configuration you can only specify the name of the process engine to use (cf. docs).
For example:
<?xml version="1.0" encoding="UTF-8"?>
<bpm-platform xmlns="http://www.camunda.org/schema/1.0/BpmPlatform"
              xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <process-engine name="my-engine">
    <!-- your configuration -->
  </process-engine>
  <process-archive>
    <process-engine>my-engine</process-engine>
    <properties>
      <property name="isDeleteUponUndeploy">false</property>
      <property name="isScanForProcessDefinitions">true</property>
    </properties>
  </process-archive>
</bpm-platform>
