What is the correct format of BatchGetBuilds parameters for 'AWS CDK - StepFunction - CallAwsService'

I'm building a state machine with the CDK, and I'm running into an issue when checking the CodeBuild project status inside the state machine.
Q. Could you let me know the correct format of batchGetBuilds parameters in CallAwsService?
import { CallAwsService } from "aws-cdk-lib/aws-stepfunctions-tasks"
import { JsonPath } from "aws-cdk-lib/aws-stepfunctions"

new CallAwsService(scope, "Check 1-1: Codebuild Status", {
  service: "codebuild",
  action: "batchGetBuilds",
  parameters: {
    Ids: [JsonPath.stringAt("$.results.codebuild.id")],
  },
  iamResources: ["*"],
  inputPath: "$",
  resultSelector: { "status.$": "$.builds[0].buildStatus" },
  resultPath: "$.results.bulidAmi",
})
I tried two ways.

1. JsonPath.stringAt("$.results.codebuild.id")

With the plain string, the execution fails with:

"An error occurred while executing the state 'Check 1-1: Codebuild Status' (entered at the event id #9).
The Parameters '{\"Ids\":\"******-generate-new-ami-project:05763ec2-89a6-4b56-8b44-************\"}' could not be used to start the Task:
[Cannot deserialize instance of `java.util.ArrayList<java.lang.Object>` out of VALUE_STRING token]"

2. [JsonPath.stringAt("$.results.codebuild.id")]

If I wrap the expression in an array, it fails during the build stage instead (I'm using CDK Pipelines to deploy this). The error message is:

Cannot use JsonPath fields in an array, they must be used in objects
+ Extra question
I found this during my search:
https://stackoverflow.com/questions/70978385/aws-step-functions-wait-for-codebuild-to-finish
Can I use this `sync` on the `CallAwsService`? (The "Main 1..." state is using `CallAwsService` as well.)
If yes, how can I use it?
Or do I need to change the `CallAwsService` to `CodeBuildStartBuild`?

Could you let me know the correct format of batchGetBuilds parameters in CallAwsService?
Use the States.Array intrinsic function. These CDK syntaxes are equivalent:
// Any one of the following is equivalent:
parameters: { 'Ids.$': 'States.Array($.results.codebuild.id)' }
parameters: { Ids: JsonPath.stringAt('States.Array($.results.codebuild.id)') }
parameters: { Ids: JsonPath.array(JsonPath.stringAt('$.results.codebuild.id')) }
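Dropped into your task, the fix is a one-line change (a sketch using the last form, keeping the state name and result paths from your snippet):

new CallAwsService(scope, "Check 1-1: Codebuild Status", {
  service: "codebuild",
  action: "batchGetBuilds",
  parameters: {
    // Renders as States.Array($.results.codebuild.id), which wraps the
    // single build id string in the array that BatchGetBuilds expects
    Ids: JsonPath.array(JsonPath.stringAt("$.results.codebuild.id")),
  },
  iamResources: ["*"],
  resultSelector: { "status.$": "$.builds[0].buildStatus" },
  resultPath: "$.results.bulidAmi",
})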
Can I use this sync on the CallAwsService?
No. The CallAwsService task implements the AWS SDK service integrations, which do not support .sync for CodeBuild actions. As of v2.15, the CDK should throw an error if you pass the RUN_JOB (= .sync) pattern to CallAwsService. See this GitHub issue for context.
Or do I need to change the CallAwsService to CodeBuildStartBuild?
Yes. CodeBuildStartBuild works as expected with the RUN_JOB integration pattern.
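A minimal sketch of that replacement (buildProject is a placeholder for your codebuild.IProject; the RUN_JOB pattern is set explicitly here):

import { CodeBuildStartBuild } from "aws-cdk-lib/aws-stepfunctions-tasks"
import { IntegrationPattern } from "aws-cdk-lib/aws-stepfunctions"

new CodeBuildStartBuild(scope, "Main 1: Start Codebuild", {
  project: buildProject, // placeholder: your codebuild.IProject
  integrationPattern: IntegrationPattern.RUN_JOB, // .sync: the state waits for the build to finish
  resultPath: "$.results.codebuild",
})

With RUN_JOB the task itself waits for the build to reach a terminal state, so the polling loop with batchGetBuilds becomes unnecessary.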

Related

Using global shared libraries in Jenkins to define parameter options

I am trying to use a global class that I've defined in a shared library to help organise job parameters. It's not working, and I'm not even sure if it is possible.
My job looks something like this:
pipelineJob('My-Job') {
  definition {
    // Job definition goes here
  }
  parameters {
    choiceParam('awsAccount', awsAccount.ALL)
  }
}
In a file in /vars/awsAccount.groovy I have the following code:
class awsAccount implements Serializable {
  final String SANDPIT = "sandpit",
  final String DEV = "dev",
  final String PROD = "prod"

  static String[] ALL = [SANDPIT, DEV, PROD]
}
Global pipeline libraries are configured to load implicitly from my repository's master branch.
When attempting to update the DSL scripts I receive the error:
ERROR: (myJob.groovy, line 67) No such property: awsAccount for class: javaposse.jobdsl.dsl.helpers.BuildParametersContext
Why does it not find the class, and is it even possible to use shared library classes like this in a pipeline job?
Disclaimer: I know it works using a Jenkinsfile. Unfortunately, I have not tested it using Declarative Pipelines, but there are no other answers yet, so it may be worth a try.
Regarding your first question: there are several reasons why a class from your shared-lib could not be found, starting from the library import, the library syntax, etc. But they definitely work for DSL. To be more precise about it, additional information would be great. But be sure that:
You have your Groovy class definition using exactly the directory structure described in the documentation (https://www.jenkins.io/doc/book/pipeline/shared-libraries/)
Give a name to the shared-lib in Jenkins as you configure it, and be sure it is exactly the name you use in the import
Use the import as described in the documentation (under Using Libraries)
Regarding your second question (the one that names this SO question): yes, you can include job parameters from information in your shared-lib. At least, using Jenkinsfiles. You can even define properties to be included in the pipeline. I got it working with a tricky syntax due to different problems.
Again, I am using Jenkinsfile and this is what worked for me:
In my shared-lib class, I added a static function that introduces the build parameters. Notice the input parameters that function needs and its usage:
class awsAccount implements Serializable {
  //
  static giveMeParameters (script) {
    return [
      // Some parms
      script.string(defaultValue: '', description: 'A default parameter', name: 'textParm'),
      script.booleanParam(defaultValue: false, description: 'If set to True, do whatever you need - otherwise, do not do it', name: 'boolOption'),
    ]
  }
}
To introduce those parameters in the pipeline, you need to place the returned value of the function into the parameters array:
properties (
  parameters (
    awsAccount.giveMeParameters (this)
  )
)
Again, notice the syntax when calling the function. Similarly, you can also define functions in the shared-lib that return properties and use them in multiple jobs (disableConcurrentBuilds, buildDiscarder, etc.)

Invoke block passed to pipeline step with parameters, from plugin

I'm trying to write a Jenkins plugin that provides a step myStep which expects a block with a single parameter, per below:
myStep { someParameter -> <user code> }
I've found that BodyInvoker (retrieved from StepContext.newBodyInvoker()) provides no facilities to invoke the user-provided block with parameters.
Expanding the environment would not be ideal: even though the type of the parameter is serializable (to/from String), I'd have to provide additional helpers to carry out this serialization, e.g.
myStep { deserialize "${env.value}" <user code> }
Do I have any other option to pass a non-string type into the provided block? Would type information of the parameter survive even if I did?
NB: I understand you can return a value from your Execution.run(), which will be the return value of the step in the pipeline. It's just that in a related shared pipeline library I'm already heavily leaning into this pattern of:
withFoo { computedFoo ->
  // something with computedFoo
  withBar computedFoo { computedBar ->
  }
}
I prefer this over:
computedFoo = withFoo
// something with computedFoo
withBar(computedFoo)
...then again, I couldn't find any plugins pulling this off.
No matter how close I look at workflow-step-api-plugin, this doesn't seem possible today. The options are:
expand the environment context with a string value
add a custom object to the context (requires access to the step context in the pipeline)
use a return value

How do I add input transformation to a target using AWS CDK for a CloudWatch event rule?

After I create a CloudWatch event rule, I am trying to add a target to it, but I am unable to add an input transformation. Previously addTarget had props that allowed for input transformation, but it does not anymore.
codeBuildRule.addTarget(new SnsTopic(props.topic));
The AWS CDK page provides this solution, but I don't exactly understand what it says:
You can add additional targets, with optional input transformer using eventRule.addTarget(target[, input]). For example, we can add a SNS topic target which formats a human-readable message for the commit.
You should specify the message prop and use RuleTargetInput static methods. Some of these methods can use strings returned by EventField.fromPath():
// From a path
codeBuildRule.addTarget(new SnsTopic(props.topic, {
  message: events.RuleTargetInput.fromEventPath('$.detail')
}));

// Custom object
codeBuildRule.addTarget(new SnsTopic(props.topic, {
  message: RuleTargetInput.fromObject({
    foo: EventField.fromPath('$.detail.bar')
  })
}));
I had the same question trying to implement this tutorial in CDK: Tutorial: Set up a CloudWatch Events rule to receive email notifications for pipeline state changes
I found this helpful as well: Detect and react to changes in pipeline state with Amazon CloudWatch Events
NOTE: I could not get it to work using the Pipeline's class method onStateChange().
I ended up writing a Rule:
const topic = new Topic(this, 'topic', {
  topicName: 'codepipeline-notes-failure',
});

const description = `Generated by the CDK for stack: ${this.stackName}`;

new Rule(this, 'failed', {
  description: description,
  eventPattern: {
    detail: { state: ['FAILED'], pipeline: ['notes'] },
    detailType: ['CodePipeline Pipeline Execution State Change'],
    source: ['aws.codepipeline'],
  },
  targets: [
    new SnsTopic(topic, {
      message: RuleTargetInput.fromText(
        `The Pipeline '${EventField.fromPath('$.detail.pipeline')}' has ${EventField.fromPath(
          '$.detail.state',
        )}`,
      ),
    }),
  ],
});
After implementing, if you navigate to Amazon EventBridge -> Rules, select the rule, select the Target(s), and then click View Details, you will see the Target Details with the Input transformer & InputTemplate.
Input transformer:
{"InputPathsMap":{"detail-pipeline":"$.detail.pipeline","detail-state":"$.detail.state"},"InputTemplate":"\"The
Pipeline '<detail-pipeline>' has <detail-state>\""}
This works for CDK in Python: CodeBuild to SNS notifications.
sns_topic = sns.Topic(...)
codebuild_project = codebuild.Project(...)

sns_topic.grant_publish(codebuild_project)

codebuild_project.on_build_failed(
    'rule-on-failed',
    target=events_targets.SnsTopic(
        sns_topic,
        message=events.RuleTargetInput.from_multiline_text(
            f"""
            Name: {events.EventField.from_path('$.detail.project-name')}
            State: {events.EventField.from_path('$.detail.build-status')}
            Build: {events.EventField.from_path('$.detail.build-id')}
            Account: {events.EventField.from_path('$.account')}
            """
        )
    )
)
Credit to @pruthvi-raj's comment on an answer above.

Jenkins Job DSL: Using parameters in groovyScript in job step

For my build job "generated-job-1" I need several parameters, which are passed in when a build of generated-job-1 is triggered via URL.
Here is my job definition, with parameters, inside the SeedJob DSL:
job('generated-job-1') {
  label('master')
  parameters {
    stringParam('DEPLOY_URI', 'https://192.168.200.176/hyperManager', 'Provide the URL where DeploymentManager can be accessed.')
    stringParam('REG_ID', '12', 'The id of the owner (Registration) of this deployment.')
  }
  steps {
    groovyCommand(readFileFromWorkspace('stepscript.groovy')) {
      prop('name', 'value')
      prop('DEPLOY_URI', $DEPLOY_URI)
    }
  }
}
I tried DEPLOY_URI, $DEPLOY_URI and ${DEPLOY_URI}, and the build fails with different error messages, like:
No such property: DEPLOY_URI for class: javaposse.jobdsl.dsl.helpers.step.GroovyContext
or
ERROR: (script, line 12) No such property: $DEPLOY_URI for class: javaposse.jobdsl.dsl.helpers.step.GroovyContext
or
ERROR: (script, line 12) No signature of method: javaposse.jobdsl.dsl.helpers.step.GroovyContext.$() is applicable for argument types: (script$_run_closure1$_closure3$_closure4$_closure5) values: [script$_run_closure1$_closure3$_closure4$_closure5#1a11cf0]
How can I define and pass those parameters to my step-script.groovy?
How could I use those parameters in other steps, such as shell or batchFile?
How do I access those parameters in my step-script.groovy, to work with the given data?
I have searched for a while and tried hard to get this working, with no success.
Help really appreciated, as I am new to Job DSL and to Groovy.
Thanks in advance,
Anne
You need to put the variable name in quotes so that it gets evaluated when the generated job is executed, not when the DSL script runs.
job('generated-job-1') {
  parameters {
    stringParam('DEPLOY_URI', '...', '...')
  }
  steps {
    groovyCommand(readFileFromWorkspace('stepscript.groovy')) {
      prop('DEPLOY_URI', '$DEPLOY_URI')
    }
  }
}

From Jenkins, how do I get a list of the currently running jobs in JSON?

I can find out just about everything about my Jenkins server via the Remote API, but not the list of currently running jobs.
These,
http://my-jenkins/computer/api/json
or
http://my-jenkins/computer/(master)/api/json
would seem like the most logical choices, but they say nothing (other than the count of jobs) about which jobs are actually running.
There is often confusion between jobs and builds in Jenkins, especially since jobs are often referred to as 'build jobs'.
Jobs (or 'build jobs' or 'projects') contain configuration that describes what to run and how to run it.
Builds are executions of a job. A build contains information about the start and end time, the status, logging, etc.
See https://wiki.jenkins-ci.org/display/JENKINS/Building+a+software+project for more information.
If you want the jobs that are currently building (i.e. have one or more running builds), the fastest way is to use the REST API with XPath to filter on colors that end with _anime, like this:
http://jenkins.example.com/api/xml?tree=jobs[name,url,color]&xpath=/hudson/job[ends-with(color/text(),%22_anime%22)]&wrapper=jobs
will give you something like:
<jobs>
  <job>
    <name>PRE_DB</name>
    <url>http://jenkins.example.com/job/my_first_job/</url>
    <color>blue_anime</color>
  </job>
  <job>
    <name>SDD_Seller_Dashboard</name>
    <url>http://jenkins.example.com/job/my_second_job/</url>
    <color>blue_anime</color>
  </job>
</jobs>
Jenkins uses the color field to indicate the status of the job, where the _anime suffix indicates that the job is currently building.
Unfortunately, this won't give you any information on the actual running build. Multiple instances of the job may be running at the same time, and the running build is not always the last one started.
If you want to list all the running builds, you can also use the REST API to get a fast answer, like this:
http://jenkins.example.com/computer/api/xml?tree=computer[executors[currentExecutable[url]],oneOffExecutors[currentExecutable[url]]]&xpath=//url&wrapper=builds
will give you something like:
<builds>
  <url>http://jenkins.example.com/job/my_first_job/1412/</url>
  <url>http://jenkins.example.com/job/my_first_job/1414/</url>
  <url>http://jenkins.example.com/job/my_second_job/13126/</url>
</builds>
Here you see a list of all the currently running builds. You will need to parse the URL to separate the job name from the build number. Notice how my_first_job has two builds that are currently running.
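If you are scripting against this endpoint, splitting a build URL into job name and build number is simple; here is a sketch in TypeScript, assuming build URLs shaped like the ones above:

// Extract the job name and build number from a Jenkins build URL,
// e.g. http://jenkins.example.com/job/my_first_job/1412/
function parseBuildUrl(buildUrl: string): { job: string; build: number } | null {
  const m = new URL(buildUrl).pathname.match(/\/job\/([^/]+)\/(\d+)\/?$/)
  return m ? { job: decodeURIComponent(m[1]), build: Number(m[2]) } : null
}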
I have a view defined using the View Job Filters Plugin that filters just the currently running jobs; you can then use /api/json on the view page to see just the jobs that are running. I also have one for aborted, unstable, etc.
UPDATE
Select Edit View → Job Filters → Add Job Filter ▼ → Build Statuses Filter
Build Statuses: ☑ Currently Building
Match Type: Exclude Unmatched - ...
Bit of a hack, but I think you can infer which jobs are currently running by looking at the color key in the job objects when you do a GET at /jenkins/api/json?pretty=true. If the 'ball' icon for a given job in Jenkins is animated, we know it's running.
Have a look at the array of job objects in the JSON response:
{
  ...
  "jobs" : [
    {
      "name" : "Test Job 1",
      "url" : "http://localhost:8000/jenkins/job/Test%20Job%201/",
      "color" : "blue"
    },
    {
      "name" : "Test Job 2",
      "url" : "http://localhost:8000/jenkins/job/Test%20Job%202/",
      "color" : "blue_anime"
    }
    ...
  ]
}
In this case "color" : "blue_anime" indicates that the job is currently running, and "color" : "blue" indicates that the job is not running.
Hope this helps.
Marshal the output and filter for "building": true from the following call to the JSON API on a job, using tree to filter out the extraneous stuff (hope this helps):
http://jenkins.<myCompany>.com/job/<myJob>/api/json?pretty=true&depth=2&tree=builds[builtOn,changeSet,duration,timestamp,id,building,actions[causes[userId]]]
will give you something like:
{
  "builds" : [
    {
      "actions" : [
        { },
        {
          "causes" : [
            {
              "userId" : "cheeseinvert"
            }
          ]
        },
        { }, { }, { }, { }
      ],
      "building" : true,
      "duration" : 0,
      "id" : "2013-05-07_13-20-49",
      "timestamp" : 1367958049745,
      "builtOn" : "serverA",
      "changeSet" : {
      }
    }, ...
You can do this with the Jenkins tree API, using an endpoint like this:
http://<host>/api/json?tree=jobs[name,lastBuild[building,timestamp]]
You can see what attributes from lastBuild you can use if you access <job-endpoint>/lastBuild/api/json.
I had a similar problem where some pipeline builds get stuck in the building state after I restart Jenkins (pipeline jobs are supposed to be durable and resume, but most of the time they get stuck indefinitely).
These builds do not use an executor so the only way to find them is to open every job.
All of the other answers seem to work when the project is considered building, i.e. the last build is building. But they ignore past builds that are still building.
The following query works for me and gives me all the currently running builds, i.e. those that do not yet have a result.
http://localhost:8080/api/xml?tree=jobs[name,builds[fullDisplayName,id,number,timestamp,duration,result]]&xpath=/hudson/job/build[count(result)=0]&wrapper=builds
Nothing worked properly for me, so I copied and modified code from python-jenkins. Since the master node name changed, it was throwing an exception, and I didn't want to rely on a plugin.
import re
from urllib.parse import urlparse

import jenkins

# Assumes an authenticated python-jenkins client
server = jenkins.Jenkins('http://localhost:8080', username='user', password='token')

def get_running_builds():
    builds = []
    nodes = server.get_nodes()
    for node in nodes:
        # the name returned is not the name to look up when
        # dealing with master :/
        if node['name'] == 'Built-In Node':
            continue
        if node['name'] == 'master':
            node_name = '(master)'
        else:
            node_name = node['name']
        try:
            info = server.get_node_info(node_name, depth=2)
        except jenkins.JenkinsException as e:
            # Jenkins may 500 on depth >0. If the node info comes back
            # at depth 0, treat it as a node not running any jobs.
            if ('[500]' in str(e) and
                    server.get_node_info(node_name, depth=0)):
                continue
            else:
                raise
        for executor in info['executors']:
            executable = executor['currentExecutable']
            if executable and 'number' in executable:
                executor_number = executor['number']
                build_number = executable['number']
                url = executable['url']
                m = re.search(r'/job/([^/]+)/.*', urlparse(url).path)
                job_name = m.group(1)
                builds.append({'name': executable['fullDisplayName'],
                               'number': build_number,
                               'url': url,
                               'node': node_name,
                               'executor': executor_number,
                               'timestamp': executable['timestamp']})
    return builds
timestamp gives the time in milliseconds.
