How can I add separate stages for the task and subtask in Odoo?

I added a boolean field (sub_task_stage) on project.task.type and set a domain for project.task:

@api.model
def _read_group_stage_ids(self, stages, domain, order):
    # res = super(ProjectTask, self)._read_group_stage_ids(stages, domain, order)
    # return res
    search_domain = [('id', 'in', stages.ids)]
    if 'default_project_id' in self.env.context:
        search_domain = [('sub_task_stage', '=', False), '|',
                         ('project_ids', '=', self.env.context['default_project_id'])] + search_domain
    stage_ids = stages._search(search_domain, order=order, access_rights_uid=SUPERUSER_ID)
    return stages.browse(stage_ids)

But both the project task and the subtask still show all stages. How can I get the stages separately for tasks and subtasks?

Related

How to get the agent that each stage is running on in a declarative Jenkins pipeline using Groovy?

I'm trying to get the agent name that each stage is running on. I can get this by making use of env.NODE_NAME, but then I'll have to write some code in each stage block to capture env.NODE_NAME from that stage.
Instead, I'm trying to capture the agents for all the stages in one place by getting the stage-type FlowNodes via the PipelineNodeGraphVisitor and finding the agent that each of these stage FlowNodes is executing on. I read about using WorkspaceAction to get the agent node, and I was able to get the agent that way.
Since my pipeline has just one agent defined at the pipeline level, the result showed each stage using the same agent. So I tried using a different agent, say 'test', for one stage and expected WorkspaceAction to report a different agent for that stage, but I'm still getting the agent defined at the pipeline level rather than the 'test' agent.
Please help if anyone knows. I've been stuck on this for quite a long time now.
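For reference, the per-stage approach I'm trying to avoid would look roughly like this (the stage name is just a placeholder):

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                script {
                    // capture the agent this particular stage ran on
                    echo "Build stage ran on: ${env.NODE_NAME}"
                }
            }
        }
        // ...and the same boilerplate repeated in every other stage
    }
}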
Here's the groovy code I used:
import jenkins.model.Jenkins
import org.jenkinsci.plugins.workflow.job.WorkflowRun
import org.jenkinsci.plugins.workflow.flow.FlowExecution
import org.jenkinsci.plugins.workflow.graph.FlowNode
import org.jenkinsci.plugins.workflow.support.actions.WorkspaceAction
import io.jenkins.blueocean.rest.impl.pipeline.PipelineNodeGraphVisitor
import io.jenkins.blueocean.rest.impl.pipeline.FlowNodeWrapper

// Get the most recent run of the current job
WorkflowRun run = Jenkins.instance.getItemByFullName(env.JOB_NAME)._getRuns()[0]
FlowExecution exec = run.getExecution()

// Walk the flow graph of that run
PipelineNodeGraphVisitor visitor = new PipelineNodeGraphVisitor(run)
def flowNodes = visitor.getPipelineNodes()

for (Iterator iterator = flowNodes.iterator(); iterator.hasNext();)
{
    def node = iterator.next()
    if (node.getType() == FlowNodeWrapper.NodeType.STAGE) // only look at stage-type flow nodes
    {
        String stageName = node.getDisplayName()
        print "${stageName}"
        // Look for a WorkspaceAction in the enclosing blocks to find the agent node
        for (FlowNode enclosing : node.getNode().iterateEnclosingBlocks()) {
            WorkspaceAction ws = enclosing.getAction(WorkspaceAction.class)
            if (ws != null) {
                print "${ws.getNode()}"
            }
        }
    }
}

How to get all branches/jobs of a multibranch pipeline job?

Is there a way to get the names of all branches that the scan of a multibranch pipeline job has gathered?
I would like to set up a nightly build with dependencies on existing build jobs and therefore need to check whether the multibranch jobs contain certain branches. Another way would be to check for an existing job.
I found a way by using the Jenkins API.
In case anyone else has this question, here is my Groovy solution (critique and edits welcome):
import java.util.ArrayList
import hudson.model.*

// Returns the full names ("<pipelineName>/<branch>") of all jobs under a multibranch pipeline job
ArrayList<String> call(String pipelineName) {
    def hi = hudson.model.Hudson.instance
    def item = hi.getItemByFullName(pipelineName)
    def jobs = item.getAllJobs()
    def arr = new ArrayList<String>()
    Iterator<?> iterator = jobs.iterator()
    while (iterator.hasNext()) {
        def job = iterator.next()
        arr.add(pipelineName + "/" + job.name)
    }
    return arr
}
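One way to use this, assuming the function above is saved as a shared-library step (for example in a hypothetical vars/getBranchJobs.groovy):

// hypothetical usage from a pipeline; 'my-multibranch-job' is a placeholder job name
def branchJobs = getBranchJobs('my-multibranch-job')
branchJobs.each { jobName ->
    echo "Found branch job: ${jobName}"
}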

Multibranch Pipeline plugin: load multiple Jenkinsfiles per branch

I am able to load a Jenkinsfile automatically through the multibranch pipeline plugin, with the limitation of only one Jenkinsfile per branch.
I have multiple Jenkinsfiles per branch which I want to load, so I tried the method below: a master Jenkinsfile that loads the specific files. However, the code below merges 1.Jenkinsfile and 2.Jenkinsfile into one pipeline.
node {
    git url: 'git@bitbucket.org:xxxxxxxxx/pipeline.git', branch: 'B1P1'
    sh "ls -latr"
    load '1.Jenkinsfile'
    load '2.Jenkinsfile'
}
Is there a way to load multiple Jenkins pipelines separately from one branch?
I did this by writing a shared library (see https://jenkins.io/doc/book/pipeline/shared-libraries/) containing the following file (in vars/generateJobsForJenkinsfiles.groovy):
/**
 * Creates Jenkins pipeline jobs from pipeline script files
 * @param gitRepoName name of the github repo, e.g. <organisation>/<repository>
 * @param filepattern ant-style pattern for the pipeline script files for which we want to create jobs
 * @param jobPath closure of type (relativePathToPipelineScript -> jobPath) where jobPath is a string
 *        formatted as '<foldername>/../<jobname>' (i.e. the Jenkins job path)
 */
def call(String gitRepoName, String filepattern, def jobPath) {
    def pipelineJobs = []
    def base = env.WORKSPACE
    def pipelineFiles = new FileNameFinder().getFileNames(base, filepattern)
    for (pipelineFil in pipelineFiles) {
        def relativeScriptPath = (pipelineFil - base).substring(1)
        def _jobPath = jobPath(relativeScriptPath).split('/')
        def jobfolderpath = _jobPath[0..-2]
        def jobname = _jobPath[-1]
        echo "Create jenkins job ${jobfolderpath.join('/')}:${jobname} for $pipelineFil"
        def dslScript = []
        // create the folders
        for (int i = 0; i < jobfolderpath.size(); i++)
            dslScript << "folder('${jobfolderpath[0..i].join('/')}')"
        // create the job
        dslScript << """
pipelineJob('${jobfolderpath.join('/')}/${jobname}') {
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        github('$gitRepoName', 'https')
                        credentials('github-credentials')
                    }
                    branch('master')
                }
            }
            scriptPath("$relativeScriptPath")
        }
    }
    configure { d ->
        d / definition / lightweight(true)
    }
}
"""
        pipelineJobs << dslScript.join('\n')
        //println dslScript
    }
    if (!pipelineJobs.empty)
        jobDsl sandbox: true, scriptText: pipelineJobs.join('\n'), removedJobAction: 'DELETE', removedViewAction: 'DELETE'
}
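A possible seed pipeline invoking this step could look like the following; the repository name, file pattern, and path mapping are only illustrative:

// hypothetical seed job that generates one pipeline job per matching Jenkinsfile
node {
    checkout scm   // the FileNameFinder above scans files in the workspace
    generateJobsForJenkinsfiles(
        'myorg/myrepo',       // <organisation>/<repository> placeholder
        '**/*.Jenkinsfile',   // ant-style pattern matching the pipeline script files
        { scriptPath -> 'generated/' + scriptPath.replace('/', '_').replace('.Jenkinsfile', '') }
    )
}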
Most likely you want to map old (pre-pipeline) Jenkins jobs that operate on a single branch of some project to a single multibranch pipeline. The appropriate approach would be to create stages that are input dependent (for example, asking the user whether to deploy to staging or live).
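As a rough illustration (not part of the original answer), such an input-dependent stage in a declarative pipeline could look like this; the stage name, message, and choices are made up:

pipeline {
    agent any
    stages {
        stage('Deploy') {
            steps {
                script {
                    // ask the user where to deploy before continuing
                    def target = input(message: 'Deploy to which environment?',
                                       parameters: [choice(name: 'TARGET', choices: ['staging', 'live'])])
                    echo "Deploying to ${target}"
                }
            }
        }
    }
}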
Alternatively, you could just create a separate Pipeline job in Jenkins that references your project's SCM and points to your other Jenkinsfile (one Pipeline job for every additional Jenkinsfile).

How to run the same job in parallel with different parameters for each run

I have a build job and a test job that takes parameters.
After the build job, I want to run the test job with one set of parameters and, at the same time, the same test job with a different set of parameters, in parallel:

              build job
                  |
          /               \
   test job               test job
   (one set of params)    (other set of params)

How can I accomplish this, and is it possible without having to write my own plugin?
We can do something like this, programmatically building one TestNG suite per parameter value and running the suites from a thread pool:
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.testng.TestNG;
import org.testng.xml.XmlClass;
import org.testng.xml.XmlSuite;
import org.testng.xml.XmlSuite.ParallelMode;
import org.testng.xml.XmlTest;

// valueList is assumed to hold one parameter value per run
List<XmlSuite> suites = new ArrayList<XmlSuite>();
for (int i = 0; i < valueList.size(); i++) {
    XmlSuite suite = new XmlSuite();
    suite.setName("TmpSuite" + i);
    XmlTest test = new XmlTest(suite);
    test.setName("TmpTest" + i);
    test.setParallel(ParallelMode.CLASSES);
    // pass a different parameter value to each generated test
    Map<String, String> parameters = new HashMap<String, String>();
    parameters.put("first-name", valueList.get(i));
    test.setParameters(parameters);
    List<XmlClass> classes = new ArrayList<XmlClass>();
    classes.add(new XmlClass("TestClass1"));
    classes.add(new XmlClass("TestClass2"));
    test.setXmlClasses(classes);
    suites.add(suite);
}

TestNG tng = new TestNG();
tng.setSuiteThreadPoolSize(5);  // run up to 5 suites in parallel
tng.setXmlSuites(suites);
tng.run();
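If the fan-out should instead happen at the Jenkins level, a scripted-pipeline sketch along these lines could also achieve the build-then-parallel-test flow; the job names and the CONFIG parameter are placeholders, not from the original answer:

// run the build job, then the same test job twice in parallel with different parameters
build job: 'build-job'
parallel(
    'test with one set of params': {
        build job: 'test-job', parameters: [string(name: 'CONFIG', value: 'setA')]
    },
    'test with other params': {
        build job: 'test-job', parameters: [string(name: 'CONFIG', value: 'setB')]
    }
)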

Call a Jenkins job by using a variable to build the job name

I'm trying to launch a job from a parameterized trigger, and I would like to compute the job name from a given variable.
Is it possible to set the field Build Triggers > Projects to build to a value like this?
${RELEASE}-MAIN-${PROJECT}-LOAD_START
Unfortunately, this isn't possible with the Build Triggers. I looked for a solution for this "higher order build job" that would allow you to create a dynamic build name with one of the parameterized build plugins, but I couldn't find one.
However, using the Groovy Postbuild Plugin, you can do a lot of powerful things. Below is a script that can be modified to do what you want. In particular, notice that it gets environmental variables using build.buildVariables.get("MY_ENV_VAR"). The environmental variable TARGET_BUILD_JOB specifies the name of the build job to build. In your case, you would want to build TARGET_BUILD_JOB using these two environmental variables:
build.buildVariables.get("RELEASE")
build.buildVariables.get("PROJECT")
The script is commented so that if you're not familiar with Groovy, which is based off Java, it should hopefully make sense!
import hudson.model.*
import hudson.model.queue.*
import hudson.model.labels.*
import org.jvnet.jenkins.plugins.nodelabelparameter.*

def failBuild(msg)
{
    throw new RuntimeException("[GROOVY] User message, exiting with error: " + msg)
}

// Get the current build job
def thr = Thread.currentThread()
def build = thr?.executable

// Get the parameters for the current build job
// For ?:, see "Elvis Operator" (http://groovy.codehaus.org/Operators#Operators-ElvisOperator)
def currentParameters = build.getAction(ParametersAction.class)?.getParameters() ?:
    failBuild("There are no parameters to pass down.")

def nodeName = build.getBuiltOnStr()

def newParameters = new ArrayList(currentParameters)
newParameters << new NodeParameterValue("param_NODE",
    "Target node -- the node of the previous job", nodeName)

// Retrieve information about the target build job
def targetJobName = build.buildVariables.get("TARGET_BUILD_JOB")
def targetJobObject = Hudson.instance.getItem(targetJobName) ?:
    failBuild("Could not find a build job with the name $targetJobName. (Are you sure the spelling is correct?)")
println("$targetJobObject, $targetJobName")
def buildNumber = targetJobObject.getNextBuildNumber()

// Add information about the downstream job to the log
def jobUrl = targetJobObject.getAbsoluteUrl()
println("Starting downstream job $targetJobName ($jobUrl)" + "\n")
println("======= DOWNSTREAM PARAMETERS =======")
println("$newParameters")

// Start the downstream build job if this build job was successful
boolean targetBuildQueued = targetJobObject.scheduleBuild(5,
    new Cause.UpstreamCause(build),
    new ParametersAction(newParameters)
);
if (targetBuildQueued)
{
    println("Build started successfully")
    println("Console (wait a few seconds before clicking): $jobUrl/$buildNumber/console")
}
else
    failBuild("Could not start target build job")
