AWS CDK Pipelines with an existing CodePipeline - aws-cdk

The documentation of @aws-cdk/pipelines seems to suggest that a CDK pipeline can be added to an existing @aws-cdk/aws-codepipeline Pipeline, using the codePipeline prop: https://docs.aws.amazon.com/cdk/api/latest/docs/#aws-cdk_pipelines.CodePipeline.html
codePipeline?: Pipeline - An existing Pipeline to be reused and built upon.
However, I am not able to get this to work and am experiencing multiple errors at the cdk synth step, depending on how I try to set it up. As far as I can tell there isn't really any documentation yet to cover this scenario.
Essentially, we are trying to create a pipeline that runs something like:
clone
lint / typecheck / unit test
cdk deploy to test environment
integration tests
deploy to preprod
smoke test
manual approval
deploy to prod
I guess it's just not clear to me what the difference is between this CodePipeline pipeline and the CDK pipeline. Also, the naming convention for stages seems a little unclear - see this issue: https://github.com/aws/aws-cdk/issues/15945
See: https://github.com/ChrisSargent/cdk-issues/blob/pipelines/lib/cdk-test-stack.ts and below:
import * as cdk from "@aws-cdk/core";
import * as pipelines from "@aws-cdk/pipelines";
import * as codepipeline from "@aws-cdk/aws-codepipeline";
import * as codepipeline_actions from "@aws-cdk/aws-codepipeline-actions";

export class CdkTestStack extends cdk.Stack {
  constructor(scope: cdk.Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    const cdkInput = pipelines.CodePipelineSource.gitHub(
      "ChrisSargent/cdk-issues",
      "pipelines"
    );

    // Set up the code source action
    const sourceOutput = new codepipeline.Artifact();
    const sourceAction = new codepipeline_actions.GitHubSourceAction({
      owner: "ChrisSargent",
      repo: "cdk-issues",
      branch: "pipelines",
      actionName: "SourceAction",
      output: sourceOutput,
      oauthToken: cdk.SecretValue.secretsManager("git/ChrisSargent"),
    });

    const pipeline = new codepipeline.Pipeline(this, "Pipeline", {
      stages: [
        {
          actions: [sourceAction],
          stageName: "GitSource",
        },
      ],
    });

    const cdkPipeline = new pipelines.CodePipeline(this, "CDKPipeline", {
      codePipeline: pipeline,
      synth: new pipelines.ShellStep("Synth", {
        // Without input, we get: Error: CodeBuild action 'Synth' requires an input (and the pipeline doesn't have a Source to fall back to). Add an input or a pipeline source.
        // With input, we get: Error: Validation failed with the following errors: Source actions may only occur in first stage
        input: cdkInput,
        commands: ["yarn install --frozen-lockfile", "npx cdk synth"],
      }),
    });

    // Produces: Stage 'PreProd' must have at least one action
    // pipeline.addStage(new MyApplication(this, "PreProd"));

    // Produces: The given Stage construct ('CdkTestStack/PreProd') should contain at least one Stack
    cdkPipeline.addStage(new MyApplication(this, "PreProd"));
  }
}

class MyApplication extends cdk.Stage {
  constructor(scope: cdk.Construct, id: string, props?: cdk.StageProps) {
    super(scope, id, props);
    console.log("Nothing to deploy");
  }
}
Any guidance or experience with this would be much appreciated.

I'm able to achieve something similar by adding waves/stages with only pre and post steps to the CDK pipeline. Sample code is below, amending your original snippet:
import * as cdk from "@aws-cdk/core";
import * as pipelines from "@aws-cdk/pipelines";
import * as codepipeline from "@aws-cdk/aws-codepipeline";
import * as codepipeline_actions from "@aws-cdk/aws-codepipeline-actions";

export class CdkTestStack extends cdk.Stack {
  constructor(scope: cdk.Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    const cdkInput = pipelines.CodePipelineSource.gitHub(
      "ChrisSargent/cdk-issues",
      "pipelines"
    );

    const cdkPipeline = new pipelines.CodePipeline(this, "CDKPipeline", {
      selfMutation: true,
      crossAccountKeys: true, // can be false if you don't need to deploy to a different account
      pipelineName, // a pipeline name of your choosing, defined elsewhere
      synth: new pipelines.ShellStep("Synth", {
        input: cdkInput,
        commands: ["yarn install --frozen-lockfile", "npx cdk synth"],
        primaryOutputDirectory: "cdk.out",
      }),
    });

    // Add any additional test steps here; they will run in parallel within a wave.
    cdkPipeline.addWave("test", { post: [provideUnitTestStep("unitTest")] });

    // Add a manual approval step if needed.
    cdkPipeline.addWave("promotion", { post: [new pipelines.ManualApprovalStep("PromoteToUat")] });

    cdkPipeline.addStage(new MyApplication(this, "PreProd"));
  }
}

class MyApplication extends cdk.Stage {
  constructor(scope: cdk.Construct, id: string, props?: cdk.StageProps) {
    super(scope, id, props);
    console.log("Nothing to deploy");
  }
}
What's worth noting is that you might need to convert the way you write your CodeBuild action to the new CDK CodeBuildStep. A sample unit test step looks like this:
import * as codebuild from "@aws-cdk/aws-codebuild";
import * as pipelines from "@aws-cdk/pipelines";

const provideUnitTestStep = (id: string): pipelines.CodeBuildStep => {
  const props: pipelines.CodeBuildStepProps = {
    partialBuildSpec: codebuild.BuildSpec.fromObject({
      version: '0.2',
      env: {
        variables: {
          DEFINE_VARIABLES: 'someVariables',
        },
      },
      phases: {
        install: {
          commands: ['install some dependencies'],
        },
        build: {
          commands: ['run some test!'],
        },
      },
    }),
    commands: [],
    buildEnvironment: {
      buildImage: codebuild.LinuxBuildImage.STANDARD_5_0,
    },
  };
  return new pipelines.CodeBuildStep(id, props);
};
It's not trivial to retrieve the underlying CodeBuild project role; you will need to pass the rolePolicyStatements property in the CodeBuildStep props to grant any extra permissions your tests need.
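For illustration, here is a minimal sketch of passing rolePolicyStatements to a CodeBuildStep; the step name, commands and bucket ARN are placeholders, not from the original answer:

import * as iam from "@aws-cdk/aws-iam";
import * as codebuild from "@aws-cdk/aws-codebuild";
import * as pipelines from "@aws-cdk/pipelines";

// Hypothetical integration-test step that needs read access to a results bucket.
const integrationTest = new pipelines.CodeBuildStep("IntegrationTest", {
  commands: ["yarn install --frozen-lockfile", "yarn test:integration"],
  buildEnvironment: {
    buildImage: codebuild.LinuxBuildImage.STANDARD_5_0,
  },
  // Extra permissions added to the generated CodeBuild project role.
  rolePolicyStatements: [
    new iam.PolicyStatement({
      actions: ["s3:GetObject", "s3:ListBucket"],
      resources: [
        "arn:aws:s3:::my-test-results-bucket",   // placeholder ARN
        "arn:aws:s3:::my-test-results-bucket/*", // placeholder ARN
      ],
    }),
  ],
});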

First of all, the error "Pipeline must have at least two stages" is correct: you only have the GitHub checkout/clone as a single stage.
For a second stage, you could use a CodeBuild project to compile/lint/unit test, as you mentioned.
However, what would you like to do with your compiled artifacts then? Build containers to deploy them later?
If so, there are better ways of doing this with CDK (DockerImageAsset). That could also let you drop your pre-existing pipeline and use the CDK pipeline directly.
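For reference, DockerImageAsset builds a local Docker image at synth time and publishes it to the CDK-managed asset repository. A minimal sketch inside a stack; the directory path and the ECS task definition are assumptions, not from this answer:

import * as path from "path";
import * as ecr_assets from "@aws-cdk/aws-ecr-assets";
import * as ecs from "@aws-cdk/aws-ecs";

// Build the image from a local Dockerfile; CDK pushes it to its asset ECR repository.
const appImage = new ecr_assets.DockerImageAsset(this, "AppImage", {
  directory: path.join(__dirname, "../docker/app"), // hypothetical path
});

// Reference the built image from a container definition (taskDefinition assumed to exist).
taskDefinition.addContainer("App", {
  image: ecs.ContainerImage.fromDockerImageAsset(appImage),
  memoryLimitMiB: 512, // placeholder sizing
});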
Could you please try setting the property restartExecutionOnUpdate: true on your regular Pipeline, as in the following snippet?
const pipeline = new codepipeline.Pipeline(this, "Pipeline", {
  restartExecutionOnUpdate: true,
  stages: [
    {
      actions: [sourceAction],
      stageName: "GitSource",
    },
  ],
});
This is needed for the self-mutation capability of the CDK pipeline.

This happened to me when I was creating a pipeline in a stack without an explicitly defined account and region.
Check that you set env like this:
new CdkStack(app, 'CdkStack', {
  env: {
    account: awsProdAccount,
    region: defaultRegion,
  },
});

Related

How can I reference my constant within a Jenkins Parameter?

I have the following code in a Pipelineconstant.groovy file:
public static final list ACTION_CHOICES = [
    N_A,
    FULL_BLUE_GREEN,
    STAGE,
    FLIP,
    CLEANUP
]
and this parameters block in my Jenkins multi-wrapper file:
parameters {
    string (name: 'ChangeTicket', defaultValue: '000000', description: 'Prod change ticket otherwise 000000')
    choice (name: 'AssetAreaName', choices: ['fpukviewwholeof', 'fpukdocrhs', 'fpuklegstatus', 'fpukbooksandjournals', 'fpukleglinks', 'fpukcasesoverview'], description: 'Select the AssetAreaName.')
    /* groovylint-disable-next-line DuplicateStringLiteral */
    choice (name: 'AssetGroup', choices: ['pdc1c', 'pdc2c'])
}
I would like to reference ACTION_CHOICES in the parameter like this:
choice (name: 'Action', choices: constants.ACTION_CHOICES, description: 'Multi Version deployment actions')
but it doesn't work for me.
You're almost there! A Jenkinsfile can be extended with variables/constants defined directly in the file or, better, in a Jenkins shared library (this scenario).
The parameter syntax in your pipeline was fine, as was the idea of a list of constants; what was missing was a proper link between those parts - the library import. See the example below. The names in the example are not carved in stone and can of course be changed, but watch out: Jenkins is quite sensitive about filenames and paths, especially in shared libraries.
A) With a Jenkins shared library: Pipelineconstant.groovy should be placed in src/org/pipelines of your Jenkins shared library.
Pipelineconstant.groovy
package org.pipelines

class Pipelineconstant {
    public static final List<String> ACTION_CHOICES = ["N_A", "FULL_BLUE_GREEN", "STAGE", "FLIP", "CLEANUP"]
}
and then you can reference this list of constants within your Jenkinsfile pipeline.
Jenkinsfile
@Library('jsl-constants') _
import org.pipelines.Pipelineconstant

pipeline {
    agent any
    parameters {
        choice (name: 'Action', choices: Pipelineconstant.ACTION_CHOICES, description: 'Multi Version deployment actions')
    }
    // rest of your pipeline code
}
The first two lines of the pipeline are important: the first loads the JSL itself, so the import on the second line can be resolved (otherwise Jenkins would not know where to find the Pipelineconstant.groovy file).
B) Without Jenkins shared library (files in one repo):
I've found this topic discussed and solved for scripted pipeline here: Load jenkins parameters from external groovy file

AWS CDK, running code on first deploy but not after

Goal
I want to be able to create a lambda function with CDK, but then manage the docker image that the lambda uses with a CI/CD pipeline (github actions)
What I have done
I have the following code:
this.repository =
  this.config.repository ??
  new ecr.Repository(this, 'Repository', {
    repositoryName: this.config.repositoryName,
  });

this.lambda = new lambda.DockerImageFunction(this, 'DockerLambda', {
  code: lambda.DockerImageCode.fromImageAsset(
    path.join(__dirname, '../docker/minimal'),
    { cmd: this.config.cmd, entrypoint: this.config.entrypoint },
  ),
  functionName: config.functionName ?? this.node.id,
  environment: config.environment,
  timeout: Duration.seconds(config.timeout ?? 600),
  memorySize: config.memorySize ?? 1024,
  vpc: config.vpc,
  vpcSubnets: config.vpcSubnets ?? {
    subnets: config.vpc?.privateSubnets,
  },
});
I am doing it this way because there doesn't appear to be a way to create a lambda without specifying where the code will come from. The 'minimal' Docker image is just a generic placeholder; it will eventually get replaced by the real code. That code does not live in the repository where we have our CDK code, so CDK does not have access to build the real Docker image.
So, the steps that we follow are:
Use this generic DockerImageLambda construct to create both an ECR repository, and a lambda with a placeholder docker image. This ECR repository is where github will be uploading the real images, but until then, it will be empty (since it was just created).
Use Github actions to upload a real docker image to the ECR repository created in step #1
Use Github actions to update the lambda function with the new image from step #2
The Problem
This method works until you change something in the lambda CDK code. At that point, it will try to reconfigure the lambda to use the placeholder docker image, which essentially "breaks" what was working there.
The question
How can I make it use the placeholder docker image only the first time the lambda is created? OR, is there a better way to do this?
You can decouple uploading the asset to ECR from the lambda definition.
To upload to the repository you created, use the cdk-ecr-deployment construct. Then create the lambda with the correct ECR repository from the beginning. You will not need to edit the lambda to change the source ECR repository.
You also need to make your Lambda construct depend on the deployment, so that when the lambda is created, the repository contains your dummy image.
It would look like this:
// Assumes: import { DockerImageAsset } from '@aws-cdk/aws-ecr-assets';
//          import { ECRDeployment, DockerImageName } from 'cdk-ecr-deployment';
this.repository =
  this.config.repository ??
  new ecr.Repository(this, 'Repository', {
    repositoryName: this.config.repositoryName,
  });

// Build the placeholder image as a CDK asset...
const dummyImage = new DockerImageAsset(this, 'DummyImageAsset', {
  directory: path.join(__dirname, '../docker/minimal'),
});

// ...and copy it into the repository created above.
const dummyDeployment = new ECRDeployment(this, 'DummyImage', {
  src: new DockerImageName(dummyImage.imageUri),
  dest: new DockerImageName(this.repository.repositoryUriForTagOrDigest('latest')),
});

this.lambda = new lambda.DockerImageFunction(this, 'DockerLambda', {
  code: lambda.DockerImageCode.fromEcr(
    this.repository,
    { cmd: this.config.cmd, entrypoint: this.config.entrypoint },
  ),
  functionName: config.functionName ?? this.node.id,
  environment: config.environment,
  timeout: Duration.seconds(config.timeout ?? 600),
  memorySize: config.memorySize ?? 1024,
  vpc: config.vpc,
  vpcSubnets: config.vpcSubnets ?? {
    subnets: config.vpc?.privateSubnets,
  },
});

this.lambda.node.addDependency(dummyDeployment);
You could also import the real ECR repository into your CDK stack with the fromXXXXX helper methods:
https://docs.aws.amazon.com/cdk/api/v2/docs/aws-cdk-lib.aws_ecr.Repository.html#static-fromwbrrepositorywbrarnscope-id-repositoryarn
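A minimal sketch of that approach inside the construct; the repository name is a placeholder:

import * as ecr from '@aws-cdk/aws-ecr';
import * as lambda from '@aws-cdk/aws-lambda';

// Import a repository that already exists instead of creating one.
const existingRepo = ecr.Repository.fromRepositoryName(this, 'ExistingRepo', 'my-service-images'); // placeholder name

const fn = new lambda.DockerImageFunction(this, 'DockerLambda', {
  code: lambda.DockerImageCode.fromEcr(existingRepo), // defaults to the 'latest' tag
});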

How do I get AWS CDK stack env values for bootstrapping an environment?

AWS and other sources consider explicitly specifying the AWS account and region for each stack to be best practice. I'm trying to write a CI pipeline that will bootstrap my environments. However, I'm not seeing any straightforward way to retrieve a stack's explicit env values from here:
regions.forEach((region) =>
  new DbUpdateStack(app, `${stackBaseName}-prd-${region}`, {
    env: {
      account: prdAccount,
      region: region,
    },
    environment_instance: 'prd',
    vpc_id: undefined,
  })
);
E.g. base-name-prd-us-east-1 knows the region and account as defined in the code, but how do I access this from the command line without doing something hacky?
I need to run cdk bootstrap with those values and I don't want to duplicate them.
The Cloud Assembly module can introspect an App's stack environments. Synth the app, then instantiate a CloudAssembly class by pointing at the cdk output directory:
import * as cx_api from '@aws-cdk/cx-api';

(() => {
  const cloudAssembly = new cx_api.CloudAssembly('cdk.out');
  const appEnvironments = cloudAssembly.stacks.map(stack => stack.environment);
  console.log(appEnvironments);
})();
Result:
[
  {
    account: '123456789012',
    region: 'us-east-1',
    name: 'aws://123456789012/us-east-1',
  },
];
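Building on that, one way to feed those environments into bootstrapping from a CI script; a rough sketch assuming cdk synth has already run and the CLI has credentials for each account:

import { execSync } from 'child_process';
import * as cx_api from '@aws-cdk/cx-api';

const cloudAssembly = new cx_api.CloudAssembly('cdk.out');

// Each environment name is already in the aws://ACCOUNT/REGION form that `cdk bootstrap` accepts.
const environments = new Set(cloudAssembly.stacks.map((stack) => stack.environment.name));

for (const env of environments) {
  execSync(`npx cdk bootstrap ${env}`, { stdio: 'inherit' });
}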

How do i add Input transformation to a target using aws cdk for a cloudwatch event rule?

After I create a CloudWatch Events rule, I am trying to add a target to it, but I am unable to add an input transformation. Previously addTarget had props that allowed an input transformation, but it does not anymore.
codeBuildRule.addTarget(new SnsTopic(props.topic));
The AWS CDK page provides this solution, but I don't exactly understand what it says:
You can add additional targets, with optional input transformer using eventRule.addTarget(target[, input]). For example, we can add a SNS topic target which formats a human-readable message for the commit.
You should specify the message prop and use RuleTargetInput static methods. Some of these methods can use strings returned by EventField.fromPath():
// From a path
codeBuildRule.addTarget(new SnsTopic(props.topic, {
  message: events.RuleTargetInput.fromEventPath('$.detail'),
}));

// Custom object
codeBuildRule.addTarget(new SnsTopic(props.topic, {
  message: events.RuleTargetInput.fromObject({
    foo: events.EventField.fromPath('$.detail.bar'),
  }),
}));
I had the same question trying to implement this tutorial in CDK: Tutorial: Set up a CloudWatch Events rule to receive email notifications for pipeline state changes
I found this helpful as well: Detect and react to changes in pipeline state with Amazon CloudWatch Events
NOTE: I could not get it to work using the Pipeline's class method onStateChange().
I ended up writing a Rule:
const topic = new Topic(this, 'topic', {
  topicName: 'codepipeline-notes-failure',
});

const description = `Generated by the CDK for stack: ${this.stackName}`;

new Rule(this, 'failed', {
  description: description,
  eventPattern: {
    detail: { state: ['FAILED'], pipeline: ['notes'] },
    detailType: ['CodePipeline Pipeline Execution State Change'],
    source: ['aws.codepipeline'],
  },
  targets: [
    new SnsTopic(topic, {
      message: RuleTargetInput.fromText(
        `The Pipeline '${EventField.fromPath('$.detail.pipeline')}' has ${EventField.fromPath(
          '$.detail.state',
        )}`,
      ),
    }),
  ],
});
After implementing, if you navigate to Amazon EventBridge -> Rules, then select the rule, then select the Target(s) and then click View Details you will see the Target Details with the Input transformer & InputTemplate.
Input transformer:
{"InputPathsMap":{"detail-pipeline":"$.detail.pipeline","detail-state":"$.detail.state"},"InputTemplate":"\"The Pipeline '<detail-pipeline>' has <detail-state>\""}
This works for CDK in Python, sending CodeBuild notifications to SNS:
sns_topic = sns.Topic(...)
codebuild_project = codebuild.Project(...)

sns_topic.grant_publish(codebuild_project)

codebuild_project.on_build_failed(
    'rule-on-failed',
    target=events_targets.SnsTopic(
        sns_topic,
        message=events.RuleTargetInput.from_multiline_text(
            f"""
            Name: {events.EventField.from_path('$.detail.project-name')}
            State: {events.EventField.from_path('$.detail.build-status')}
            Build: {events.EventField.from_path('$.detail.build-id')}
            Account: {events.EventField.from_path('$.account')}
            """
        )
    )
)
Credits to @pruthvi-raj's comment on an answer above.

How to setup AWS CDK app execution in AWS CodeBuild?

I want to run AWS CDK synthesis from Git repository using AWS CodeBuild - i.e. if I update the CDK app code in the repo I want CloudFormation stacks to be updated automatically. What are the best practices for setting up build role permissions?
For a GitHub repository, your CodeBuild role doesn't need additional permissions but it should have access to an oauthToken to access GitHub.
For a CodeCommit repository, create or import a codecommit.Repository object and use a CodeCommitSource object for your source parameter, and the build role permissions will be set up automatically (in particular, the permissions that will be added will be to codecommit:GitPull from the indicated repository).
See here.
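As a rough sketch of the CodeCommit variant (the repository name, project id and commands are placeholders; recent CDK versions expose this through the Source.codeCommit() factory, and actually deploying from the build would need additional CloudFormation permissions on the role):

import * as codebuild from '@aws-cdk/aws-codebuild';
import * as codecommit from '@aws-cdk/aws-codecommit';

// Inside a stack: a CodeBuild project that pulls from CodeCommit and synthesizes the CDK app.
const repo = codecommit.Repository.fromRepositoryName(this, 'Repo', 'my-cdk-app'); // placeholder name

new codebuild.Project(this, 'CdkSynth', {
  source: codebuild.Source.codeCommit({ repository: repo }), // grants codecommit:GitPull automatically
  buildSpec: codebuild.BuildSpec.fromObject({
    version: '0.2',
    phases: {
      install: { commands: ['npm ci'] },
      build: { commands: ['npx cdk synth'] },
    },
  }),
});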
You might also be interested in CDK's app-delivery package. It doesn't just create a CodeBuild project though, it uses CodePipeline to fetch, build and deploy a CDK application, so it might be more than you are looking for.
A month ago, AWS released a new module to the CDK suite called pipelines, which includes several utilities to ease the job of setting up self-mutating pipelines. In addition, there's codepipeline-actions, which includes constructs to hook your pipeline up to CodeCommit, GitHub, BitBucket, etc.
Here's a complete example (verbatim from the linked blog post), using GitHub as a source, that deploys a Lambda through CodePipeline:
Create a stage with your stack
import { CfnOutput, Construct, Stage, StageProps } from '@aws-cdk/core';
import { CdkpipelinesDemoStack } from './cdkpipelines-demo-stack';

/**
 * Deployable unit of web service app
 */
export class CdkpipelinesDemoStage extends Stage {
  public readonly urlOutput: CfnOutput;

  constructor(scope: Construct, id: string, props?: StageProps) {
    super(scope, id, props);

    const service = new CdkpipelinesDemoStack(this, 'WebService');

    // Expose CdkpipelinesDemoStack's output one level higher
    this.urlOutput = service.urlOutput;
  }
}
Create a stack with your pipeline
import * as codepipeline from '@aws-cdk/aws-codepipeline';
import * as codepipeline_actions from '@aws-cdk/aws-codepipeline-actions';
import { Construct, SecretValue, Stack, StackProps } from '@aws-cdk/core';
import { CdkPipeline, SimpleSynthAction } from '@aws-cdk/pipelines';

/**
 * The stack that defines the application pipeline
 */
export class CdkpipelinesDemoPipelineStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    const sourceArtifact = new codepipeline.Artifact();
    const cloudAssemblyArtifact = new codepipeline.Artifact();

    const pipeline = new CdkPipeline(this, 'Pipeline', {
      // The pipeline name
      pipelineName: 'MyServicePipeline',
      cloudAssemblyArtifact,

      // Where the source can be found
      sourceAction: new codepipeline_actions.GitHubSourceAction({
        actionName: 'GitHub',
        output: sourceArtifact,
        oauthToken: SecretValue.secretsManager('github-token'),
        owner: 'OWNER',
        repo: 'REPO',
      }),

      // How it will be built and synthesized
      synthAction: SimpleSynthAction.standardNpmSynth({
        sourceArtifact,
        cloudAssemblyArtifact,

        // We need a build step to compile the TypeScript Lambda
        buildCommand: 'npm run build',
      }),
    });

    // This is where we add the application stages
    // ...
  }
}
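Where the comment above says "This is where we add the application stages", the stage from the first snippet would be added roughly like this; the account and region values are placeholders:

// Inside the pipeline stack constructor:
pipeline.addApplicationStage(new CdkpipelinesDemoStage(this, 'PreProd', {
  env: { account: '123456789012', region: 'eu-west-1' }, // placeholder environment
}));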
