I have my pipelines in one bitbucket repo.
├── shared_libs
(...)
├── pipelines
│   ├── group1
│   │   ├── pipelineA
│   │   │   └── Jenkinsfile
│   │   └── pipelineB
│   │       └── Jenkinsfile
│   └── group2
│       ├── pipelineD
│       │   └── Jenkinsfile
│       └── pipelineC
│           └── Jenkinsfile
So far I have been creating jobs manually in the GUI, configuring all the parameters, the repo, and the path to each Jenkinsfile. Now I'm having a hard time finding a plugin or some other way to automate this: a sort of auto-discovery that, after I set up repo access, creates a job based on each Jenkinsfile, puts it in the correct place in the folder structure, and sets up all the parameters. Much like shared libraries, where I specify the repo, path, and key, and it's ready to use.
As those pipelines are not just for build and deploy but for all sorts of admin/cleanup work, I don't want to run them automatically. I just want the job to appear in the GUI automatically once it's pushed to the repo/merged to master.
Yes, you can automate the process. It's difficult to give a complete solution without knowing all the bits and pieces, but here is how you can do this.
You can create a new Jenkins job that is triggered by a webhook from the Bitbucket repo. The webhook request will transmit the latest changed files, so from the request you can extract the files that were added or changed. Based on this information you can use the following Groovy script to create the new job and move it to the desired folder.
def createJenkinsJob(def jobName, def folderName) {
    echo "Creating the job ${jobName}"
    // Here I'm using a shared library in the pipeline, so I load my shared library here.
    // You can simply have the entire pipeline syntax here instead.
    def jobDSL = "@Library('ycr@master') _\n" +
                 "Pipeline()"
    def flowDefinition = new org.jenkinsci.plugins.workflow.cps.CpsFlowDefinition(jobDSL, true)
    def jenkins = Jenkins.instance
    def job = new org.jenkinsci.plugins.workflow.job.WorkflowJob(jenkins, jobName)
    job.definition = flowDefinition
    job.setConcurrentBuild(false)
    job.save()
    jenkins.reload()
    // After creating the job, move it to the target folder
    if (folderName != null && folderName != "") {
        def folder = jenkins.getItemByFullName(folderName)
        Items.move(job, folder)
    }
}
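For example, if the webhook payload shows a new file at pipelines/group1/pipelineA/Jenkinsfile, you might call the function like this (the job and folder names are illustrative, and the target folder must already exist in Jenkins):

createJenkinsJob('pipelineA', 'pipelines/group1')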
I have the following structure:
/Jenkinsfile/script2.groovy
/Jenkinsfile/pipeline2.yaml
/script1.groovy
/pipeline1.yaml
There's a reference in script1 to the pipeline using:
yamlFile "pipeline1.yaml"
or
yamlFile "./Jenkinsfile/pipeline2.yaml"
and it works fine. I'm trying to use the same pipeline file in script2 but can't make it work.
Here's the relevant part of the script:
pipeline {
    agent {
        kubernetes {
            cloud "xxxx"
            yamlFile "pipeline.yml"
        }
    }
Any idea?
Note: pipeline1 and pipeline2 are the same file, just shown in different locations.
Given the directory structure you mentioned:
.
├── Jenkinsfile
│   ├── pipeline2.yaml
│   └── script2.groovy
├── pipeline1.yaml
└── script1.groovy
The following files can be read as shown below, relative to the directory the scripts are run from:
For script1, run from ./
groovy ./script1.groovy is able to read both ./pipeline1.yaml and ./Jenkinsfile/pipeline2.yaml.
For script2, run from ./
groovy ./Jenkinsfile/script2.groovy is able to read ./pipeline1.yaml, since that file is in the directory the script is being run from, i.e. ./.
groovy ./Jenkinsfile/script2.groovy is also able to read ./Jenkinsfile/pipeline2.yaml, again because the path is relative to ./.
I think you could simplify this by having all the files reside in one directory, and by using the syntax readYaml(file: './nameOfFile.yaml') (see the readYaml documentation); a sketch follows the tree below.
.
├── pipeline1.yaml
├── script1.groovy
├── pipeline2.yaml
└── script2.groovy
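A minimal sketch of that readYaml approach, assuming the Pipeline Utility Steps plugin (which provides readYaml) is installed:

node {
    checkout scm
    // readYaml parses the file into a Map/List structure
    def spec = readYaml file: './pipeline2.yaml'
    echo "Loaded YAML with top-level keys: ${spec.keySet()}"
}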
I'm trying to make Bazel build a jar and an .so file for a Flutter project, but every time I type bazel build into the command prompt I keep getting
ERROR: The 'build' command is only supported from within a workspace (below a directory having a WORKSPACE file).
See documentation at https://docs.bazel.build/versions/master/build-ref.html#workspace
I've read some documentation, and it seems like the solution is to create a blank file called WORKSPACE, but I don't understand where this file is supposed to be stored. Here's a link to the documentation I read: https://docs.bazel.build/versions/2.0.0/tutorial/java.html
Thanks in advance!
The WORKSPACE file goes in the root of your workspace (the source tree). That is the top directory for all your build packages and the start of the absolute paths you refer to with //. For instance, if you had a tree like this:
.
├── BUILD
├── a_source_file
├── package1
│   ├── BUILD
│   └── other_source
└── package2
    ├── BUILD
    └── another_source
You would create your WORKSPACE file at the point where all your packages converge (the root they share):
.
├── BUILD
├── WORKSPACE
├── a_source_file
├── package1
│   ├── BUILD
│   └── other_source
└── package2
    ├── BUILD
    └── another_source
And your build targets could then be for instance: //:a_build_target or //package2:another_target.
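As a hedged illustration, package2/BUILD could back //package2:another_target with any rule; a filegroup is the simplest placeholder:

# package2/BUILD -- hypothetical content
filegroup(
    name = "another_target",
    srcs = ["another_source"],
)

From the workspace root you could then run bazel build //package2:another_target.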
Problem: I have one single repository where I have to walk through the repo to find a specific Jenkinsfile to run the pipeline. Note that I want to define the path to this Jenkinsfile explicitly, so I thought about having a jenkinsfilePath.yml in the root directory of the repo, reading the yaml, changing directory, and running the Jenkinsfile from that path. The folder structure is as follows:
testingSingleRepo
├── Jenkinsfile
├── feature_flagging
│   ├── Jenkinsfile
│   ├── __init__.py
│   ├── src
│   └── tests
└── jenkinsfilePath.yml
I am having issues running the Jenkinsfile inside feature_flagging from the root Jenkinsfile in testingSingleRepo. I was successful in changing directory to the folder feature_flagging by using dir. After googling a lot of similar questions, I came across the build function, but I could not make that work. Any suggestions/solutions?
To call a Jenkinsfile from a main pipeline, we can use load:
load 'feature_flagging/Jenkinsfile'
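Note that load evaluates the target file as Groovy inside the running pipeline, so it needs a node with the repo checked out, and it works best when the loaded Jenkinsfile contains scripted steps rather than a full declarative pipeline block. A minimal sketch:

node {
    checkout scm
    // Evaluates feature_flagging/Jenkinsfile in this pipeline's context
    load 'feature_flagging/Jenkinsfile'
}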
So after looking around, I have decided to go with another approach. I will have a master Jenkinsfile in the root with a generic pipeline setup. It will read the yaml file, change directory, and execute shell scripts inside the Jenkins/ folder accordingly. That folder will consist of generic scripts that map to the root Jenkins pipeline, such as setup.sh, test.sh, deploy.sh, etc. The folder structure will look something like below (a sketch of the master Jenkinsfile follows the tree):
testingSingleRepo
├── Jenkinsfile
├── feature_flagging
│   ├── Jenkins/
│   ├── __init__.py
│   ├── src
│   └── tests
└── jenkinsfilePath.yml
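A minimal sketch of that master Jenkinsfile, assuming jenkinsfilePath.yml stores the module path under a path key (the key and script names here are illustrative) and readYaml is available from the Pipeline Utility Steps plugin:

node {
    checkout scm
    // The yaml in the repo root points at the module to build
    def config = readYaml file: 'jenkinsfilePath.yml'
    dir(config.path) {
        // Generic per-module scripts live in the module's Jenkins/ folder
        sh 'Jenkins/setup.sh'
        sh 'Jenkins/test.sh'
        sh 'Jenkins/deploy.sh'
    }
}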
I have an existing project which is built with Maven. It typically defines several modules. I want to migrate this project to Bazel.
In a first attempt, I use the following layout:
└── project
    ├── moduleA
    │   ├── BUILD
    │   ├── pom.xml
    │   └── src
    │       ├── main
    │       │   └── java
    │       └── test
    │           ├── data
    │           └── java
    ├── moduleB
    │   ├── BUILD
    │   ├── pom.xml
    │   └── src
    │       ├── main
    │       │   └── java
    │       └── test
    │           └── java
    ├── pom.xml
    └── WORKSPACE
It was not too hard to make the project build with Bazel. My problem now is that the tests fail to find their test data.
Indeed, with Maven (or Ant), the working directory is the one that contains the pom.xml (or build.xml). So, in that case, a test in moduleA can do:
new File("src/test/data/foo.txt");
However, when the test runs in Bazel, the working directory is the sandboxed runfiles directory, which is rooted like the workspace, i.e. the test must now open:
new File("moduleA/src/test/data/foo.txt");
This is all fine after migration, but how do you handle this situation during migration, i.e. how do you make the test pass both in Maven and in Bazel?
Is there any facility offered by the Bazel test runner to adapt the paths to legacy behaviour?
The current workaround I have is to check whether new File(".").getAbsoluteFile().getParentFile() is named __main__.
See TestFileUtil.java
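A minimal sketch of that workaround, assuming Bazel's default runfiles directory name __main__ and that each test knows its module's name (this TestFileUtil is illustrative, not the linked file):

import java.io.File;

// Resolve test data relative to the module dir under Maven,
// and relative to the workspace root under Bazel's runfiles.
public final class TestFileUtil {
    public static File testFile(String moduleName, String relativePath) {
        File cwd = new File(".").getAbsoluteFile().getParentFile();
        // Bazel runs tests from <runfiles>/__main__ by default
        if ("__main__".equals(cwd.getName())) {
            return new File(moduleName, relativePath);
        }
        return new File(relativePath); // Maven: cwd is already the module dir
    }
}

Usage: new File("src/test/data/foo.txt") becomes TestFileUtil.testFile("moduleA", "src/test/data/foo.txt").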
I am using UseLATEX.cmake to compile my project's documentation folder.
My project is organized as follows --
.
├── CMakeLists.txt
├── bin
├── build
├── cmake
│   ├── CMakeCompilerFlags.cmake
│   ├── CMakeDefaults.cmake
│   ├── MacroEnsureOutOfSourceBuilds.cmake
│   └── UseLATEX.cmake
├── doc
│   ├── Doc.tex
│   ├── CMakeLists.txt
│   └── images
│       ├── img1.png
│       ├── img2.png
│       ├── img3.png
│       └── img4.jpeg
............
└── src
    ├── CMakeLists.txt
    ├── file1.cpp
    ├── file2.cpp
    └── file3.cpp
My root-level CMake file is like this:
cmake_minimum_required(VERSION 2.8 FATAL_ERROR)
# Set path for CMake
set(CMAKE_MODULE_PATH
"${CMAKE_SOURCE_DIR}/cmake"
${CMAKE_MODULE_PATH}
)
# Define project settings
project(proj)
set(APPLICATION_NAME ${PROJECT_NAME})
include(CMakeDefaults)
# Compile Program and Docs
include_directories(inc)
add_subdirectory(src)
add_subdirectory(doc)
And the CMakeLists file in the doc folder is:
include(UseLATEX)
ADD_LATEX_DOCUMENT(Doc.tex
    #BIBFILES mybib.bib
    IMAGE_DIRS images
    DEFAULT_PDF
)
Now I compile my project in the build folder. Is there any way I can copy the Doc.pdf file created in the build/doc folder back to the top of my build folder?
Since ADD_LATEX_DOCUMENT adds a CMake target named pdf here, you should be able to make use of add_custom_command. Try adding the following to your /doc/CMakeLists.txt after the ADD_LATEX_DOCUMENT call:
add_custom_command(TARGET pdf POST_BUILD
    COMMAND ${CMAKE_COMMAND} -E copy
        ${CMAKE_CURRENT_BINARY_DIR}/Doc.pdf
        ${CMAKE_BINARY_DIR}/Doc.pdf)
This custom command invokes the cmake executable (held in the variable ${CMAKE_COMMAND}) along with the -E copy arguments every time the pdf target is built.
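If the docs are rebuilt often, ${CMAKE_COMMAND} -E copy_if_different is a drop-in alternative that skips the copy when Doc.pdf has not changed.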