Gradle publishToMavenLocal - Jenkins

When I run gradle clean build publishToMavenLocal with Gradle 7.1 and the following publishing block:
publishing {
    publications {
        maven(MavenPublication) {
            from components.java
        }
    }
}
It works in a local Gradle build. However, when I run the build on Jenkins, I get:
org.jenkinsci.plugins.workflow.steps.MissingContextVariableException: Required context class hudson.FilePath is missing
Perhaps you forgot to surround the code with a step that provides this, such as: node
When I comment out the publishing section in build.gradle, it works on Jenkins.
Any ideas?
Thanks

Publishing to Maven local needs a workspace on disk. The MissingContextVariableException for hudson.FilePath means the publishing step is running outside of a node block, so no agent workspace is available to it; wrap the Gradle invocation in a node block (or give the stage an agent) so it has a workspace.
Based on: https://support.cloudbees.com/hc/en-us/articles/4402585187483-How-to-troubleshoot-hudson-FilePath-is-missing-in-a-Pipeline-run
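For illustration, a minimal scripted-pipeline sketch (the stage name and the gradlew invocation are assumptions, not taken from the question) that gives the Gradle step a workspace:
node {
    // node allocates an agent workspace, which provides the hudson.FilePath context
    checkout scm
    stage('Publish') {
        // the same Gradle invocation, now running inside the workspace
        sh './gradlew clean build publishToMavenLocal'
    }
}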

Related

How to create an Allure report in a multi-module project

I am able to create an Allure report for my automation tests at the individual module level, and it works fine locally, but that doesn't seem to work on Jenkins, where all modules are run as part of a pipeline; I guess Jenkins is unable to find an aggregated Allure report at the project directory level. Is there a way to handle this? Any suggestions appreciated.
On Jenkins an empty report is attached and the log reads:
$ /opt/fsroot/tools/ru.yandex.qatools.allure.jenkins.tools.AllureCommandlineInstallation/2.13.1/bin/allure generate -c -o /opt/fsroot/workspace/Flow/~/--tests/allure-report
allure-results does not exists
Report successfully generated to /opt/fsroot/workspace/Flow/~/--tests/allure-report
Allure report was successfully generated.
Prerequisite: the Allure plugin is installed in Jenkins.
After the test stage, use the following in the post section:
post {
    always {
        allure includeProperties: false, jdk: '', results: [[path: 'target/allure-results']]
    }
}
You can always copy the results from your modules to the project's root manually if the core plugin functionality doesn't work for you for some reason. Here's a Gradle (Kotlin DSL) example, but you can do the same for a Maven project as well.
tasks.register<Copy>("copyApiResults") {
    from("${projectDir}/api/build/allure-results")
    into("${buildDir}/allure-results")
}
tasks.register<Copy>("copyUiResults") {
    from("${projectDir}/ui/build/allure-results")
    into("${buildDir}/allure-results")
}
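As a rough sketch (the stage layout and the aggregated build/allure-results path are assumptions based on the copy tasks above), the pipeline could then run the copy tasks and point the allure step at the aggregated directory; these are excerpts, not a complete Jenkinsfile:
stage('Aggregate Allure results') {
    steps {
        // run the copy tasks defined above so all module results end up in build/allure-results
        sh './gradlew copyApiResults copyUiResults'
    }
}
post {
    always {
        // point the Allure plugin at the aggregated directory instead of a single module
        allure includeProperties: false, jdk: '', results: [[path: 'build/allure-results']]
    }
}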

IntelliJ, Groovy, Jenkins, and unresolved access

I am building a shared library that is used to run pipelines on Jenkins.
Frequently I use things like:
def call(String stage_name = "Generate installer") {
    //noinspection GrUnresolvedAccess
    stage(stage_name) {
        ...
    }
}
stage is a Jenkins step. It generates an "unresolved access" warning in IntelliJ because it's not declared anywhere. Fortunately it does not produce an error, as a missing class import would.
However, I do wonder if there's a better solution than adding suppress-warning declarations. Is there any way to tell IntelliJ that the Jenkins steps exist?
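For context, a step like this lives under the shared library's vars directory and is consumed from a Jenkinsfile roughly like the sketch below (the library name and the vars/generateInstaller.groovy file name are assumptions for illustration, not part of the question):
// assuming the step above is defined in vars/generateInstaller.groovy
@Library('my-shared-library') _
node {
    generateInstaller('Generate installer')
}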

How build a Dockerfile in a Subdirectory using a Jenkinsfile

I have a GitHub repository with a declarative-pipeline Jenkinsfile.
The working directory on my node contains subdirectories.
The project is a simple empty Linux Docker project created with Visual Studio 2017 and .NET Core 2.1. It runs normally on my Windows 7 machine and serves a hello-world web page.
I am unable to build the Dockerfile on Jenkins.
I can start the Dockerfile build using dir(...){}.
The failing step is always
COPY ["MyProject/MyProject.csproj", "MyProject/"]
This step expects the path to be relative to MySolution.
The file Workspace/MySolution/MyProject/MyProject.csproj exists.
The error message is that Workspace/MyProject/MyProject.csproj does not exist.
I searched exhaustively on Google and Stack Overflow. Among the things I tried are combinations of sh commands, dir syntax, and docker build options like -f. Some were straight-up failures, and the best results still ran into the COPY step issue.
One example of a failing step in the Jenkinsfile would be:
dir("MySolution/MyProject")
{
script
{
docker.build("MyProject", ".")
}
}
This fails with the COPY issue from above.
I have seen questions on SO that seem not quite to apply here, and whose solutions did not transfer to this issue.
It turns out I ended up really close to the solution. The COPY paths in a Dockerfile are resolved against the Docker build context, so the context needs to be the solution directory while the -f option selects the Dockerfile inside the project subdirectory.
This fixed my sub-folder problem:
dir("MySolution")
{
script
{
docker.build("MyProject", "-f ./MyProject/Dockerfile .")
}
}
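For comparison, the docker.build call above is roughly equivalent to the following sh step run from the solution directory (a sketch; the lowercase image name is an illustrative choice, and the layout is taken from the question):
dir("MySolution") {
    // the build context is MySolution, so COPY ["MyProject/MyProject.csproj", ...] resolves;
    // -f selects the Dockerfile that lives in the project subdirectory
    sh 'docker build -t myproject -f ./MyProject/Dockerfile .'
}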

How do I make jar files available to ant in a Jenkins pipeline?

I've put together a basic Jenkins pipeline and it does what I expect for the most part.
However, I'm using ant and it requires access to specific jar files. I've specified the build step like so:
stage('Build') {
    // Build the project
    env.PATH = "${tool 'ant'}/bin:${env.PATH}"
    sh 'ant -f dita-tools/build_all.xml -lib $WORKSPACE/dita-ot/lib:$WORKSPACE/dita-ot/lib/saxon'
}
The build I run through this pipeline fails and generates the following error:
java.lang.ClassNotFoundException: org.dita.dost.module.GenMapAndTopicListModule
From what I can tell, this is due to Ant not having access to the dost.jar that ships with the DITA-OT. I've tried defining this argument a number of ways, including referencing dost.jar specifically (I have a number of jars to include), but every time it fails with the same error.
When I set up a stand-alone Ant project in Jenkins, Ant has no problem accessing the jar via the argument I provide above. Is there a better way for me to supply this argument/dependency in a pipeline?
UPDATE:
I added an echo statement for the classpath to my build script and was able to verify that adding the jars to the classpath in the build script does in fact work. So, for all intents and purposes, Ant has access to all the relevant base toolkit jars for the target, but the error persists. At this point, it appears the problem has something to do with how the Jenkins pipeline works rather than with the DITA-OT itself.
I assume you use custom plugins; if so, please make sure you have correctly declared your jars in the plugin.xml, like so:
<feature extension="dita.conductor.lib.import" file="lib/my.jar"/>
UPDATE
java.lang.ClassNotFoundException: org.dita.dost.module.GenMapAndTopicListModule
This error means that the main DITA-OT jar is not found on your classpath, which indicates that this is not a plugin issue.
Usually you don't have to set up the classpath; Ant does that for you. Please also read Creating an Ant build script.
Please try a snippet like this one:
node {
    try {
        checkout scm
        stage('Build') {
            sh '''
                dir=$(pwd)
                curl [your-dita-ot-url] --output dita-ot.zip
                unzip -qq "$dir/dita-ot.zip"
                rm dita-ot.zip
                chmod +x ${WORKSPACE}/dita-ot/bin/ant
                ${WORKSPACE}/dita-ot/bin/ant -f ${WORKSPACE}/build.xml -Ddita.dir=$dir/dita-ot -Dbranch.name=$BRANCH_NAME
            '''
        }
    } catch (e) {
        currentBuild.result = "FAILED"
        throw e
    } finally {
        // notifyBuild is assumed to be a helper defined elsewhere (e.g. in a shared library)
        notifyBuild(currentBuild.result)
    }
}

How to run groovy scripts in gradle project from windows command line

I have created a Gradle project for Geb-Spock. In this project I have created scripts in a Groovy class.
I would like to execute those scripts from the command line with Gradle commands, so that I can achieve Jenkins - Gradle - Geb integration.
Can you please help me with the command to execute the Gradle/Groovy scripts from the Windows command line? Thanks for your help on this.
If I understood correctly, you want to run it like:
gradle myScript
You define a task in Gradle and import your Groovy class into build.gradle like:
import com.MyScript

task myScript {
    // doLast ensures the script runs at execution time, not at configuration time
    doLast {
        new MyScript().run()
    }
}
You also need a dependency for your script on the buildscript classpath (look here), e.g.:
buildscript {
    repositories {
        mavenCentral()
    }
    dependencies {
        classpath files('path/to/myScript/lib')
    }
}
Also take a look at how to compile code for the buildscript classpath if you need to: https://docs.gradle.org/current/userguide/organizing_build_logic.html#sec:build_sources
I'm not really sure you need all that. If you just want to run some Geb-Spock tests with Gradle, take a look at https://github.com/geb/geb-example-gradle
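As a minimal sketch (the geb.env property and the spec name are illustrative assumptions; the geb-example-gradle project wires this up in full), Geb-Spock specs are normally run through Gradle's test task, which you can invoke from a Windows command prompt or a Jenkins step:
// build.gradle - pass the Geb environment (browser) through to the forked test JVM
test {
    systemProperty 'geb.env', System.getProperty('geb.env', 'chrome')
}
From a Windows command prompt (or a Jenkins bat step) you would then run something like gradlew.bat test --tests "com.MyGebSpec" -Dgeb.env=chrome.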
