Goal
I want to be able to create a Lambda function with CDK, but then manage the Docker image that the Lambda uses with a CI/CD pipeline (GitHub Actions).
What I have done
I have the following code:
this.repository =
this.config.repository ??
new ecr.Repository(this, 'Repository', {
repositoryName: this.config.repositoryName,
});
this.lambda = new lambda.DockerImageFunction(this, 'DockerLambda', {
code: lambda.DockerImageCode.fromImageAsset(
path.join(__dirname, '../docker/minimal'),
{ cmd: this.config.cmd, entrypoint: this.config.entrypoint },
),
functionName: config.functionName ?? this.node.id,
environment: config.environment,
timeout: Duration.seconds(config.timeout ?? 600),
memorySize: config.memorySize ?? 1024,
vpc: config.vpc,
vpcSubnets: config.vpcSubnets ?? {
subnets: config.vpc?.privateSubnets,
},
});
I am doing it this way because there doesn't appear to be a way to create a Lambda without specifying where the code will come from. The 'minimal' Docker image is just a generic placeholder; it will eventually get replaced by the real code. That code does not live in the repository where we have our CDK code, so CDK does not have access to build the real Docker image.
So, the steps that we follow are:
1. Use this generic DockerImageLambda construct to create both an ECR repository and a Lambda with a placeholder Docker image. This ECR repository is where GitHub will be uploading the real images, but until then it will be empty (since it was just created).
2. Use GitHub Actions to upload a real Docker image to the ECR repository created in step #1.
3. Use GitHub Actions to update the Lambda function with the new image from step #2.
The Problem
This method works until you change something in the lambda CDK code. At that point, it will try to reconfigure the lambda to use the placeholder docker image, which essentially "breaks" what was working there.
The question
How can I make it use the placeholder docker image only the first time the lambda is created? OR, is there a better way to do this?
You can decouple uploading the asset to ECR from the lambda definition.
To upload to the repository you created, use the cdk-ecr-deployment construct. Then create the lambda with the correct ECR repository from the beginning. You will not need to edit the lambda to change the source ECR repository.
You also need to make your Lambda construct depend on the deployment, so that when the lambda is created, the repository contains your dummy image.
It would look like this:
this.repository =
this.config.repository ??
new ecr.Repository(this, 'Repository', {
repositoryName: this.config.repositoryName,
});
const dummyImage = new DockerImageAsset(this, 'DummyImage', {
directory: path.join(__dirname, '../docker/minimal'),
});
const dummyDeployment = new ECRDeployment(this, 'DummyDeployment', {
src: new DockerImageName(dummyImage.imageUri),
dest: new DockerImageName(this.repository.repositoryUriForTagOrDigest('latest')),
});
this.lambda = new lambda.DockerImageFunction(this, 'DockerLambda', {
code: lambda.DockerImageCode.fromEcr(
this.repository,
{ cmd: this.config.cmd, entrypoint: this.config.entrypoint },
),
functionName: config.functionName ?? this.node.id,
environment: config.environment,
timeout: Duration.seconds(config.timeout ?? 600),
memorySize: config.memorySize ?? 1024,
vpc: config.vpc,
vpcSubnets: config.vpcSubnets ?? {
subnets: config.vpc?.privateSubnets,
},
});
this.lambda.node.addDependency(dummyDeployment)
You could import the real ECR repository into your CDK stack with the fromXXXXX helper methods.
https://docs.aws.amazon.com/cdk/api/v2/docs/aws-cdk-lib.aws_ecr.Repository.html#static-fromwbrrepositorywbrarnscope-id-repositoryarn
Related
I'm trying to create an infrastructure with AWS CDK. When creating a lambda, it forces me to specify the code that's going in it.
However, that'll be the responsibility of the release pipeline.
Is there a way to create a lambda without specifying the code?
No. code is a required prop in the CDK Lambda Function construct*. Use the InlineCode class as a minimal placeholder:
new lambda.Function(this, "Lambda", {
code: new lambda.InlineCode(
"exports.handler = async (event) => console.log(event)"
),
runtime: lambda.Runtime.NODEJS_18_X,
handler: "index.handler",
});
* It's also required for the CDK L1 CfnFunction. For what it's worth, Code is also a required input in the CreateFunction API and SDK commands.
AWS and other sources consider explicitly specifying the AWS account and region for each stack to be best practice. I'm trying to write a CI pipeline that will bootstrap my environments. However, I'm not seeing any straightforward way to retrieve the stack's explicit env values from here:
regions.forEach((region) =>
new DbUpdateStack(app, `${stackBaseName}-prd-${region}`, {
env: {
account: prdAccount,
region: region
},
environment_instance: 'prd',
vpc_id: undefined,
})
);
E.g., base-name-prd-us-east-1 knows the region and account as defined in the code, but how do I access this from the command line without doing something hacky?
I need to run cdk bootstrap with those values and I don't want to duplicate them.
The Cloud Assembly module can introspect an App's stack environments. Synth the app, then instantiate a CloudAssembly class by pointing at the cdk output directory:
import * as cx_api from '@aws-cdk/cx-api';
(() => {
const cloudAssembly = new cx_api.CloudAssembly('cdk.out');
const appEnvironments = cloudAssembly.stacks.map(stack => stack.environment);
console.log(appEnvironments);
})();
Result:
[
{
account: '123456789012',
region: 'us-east-1',
name: 'aws://123456789012/us-east-1',
},
];
Running cdk deploy after updating my Stack:
export function createTaskXXXX (stackScope: Construct, workflowContext: WorkflowContext) {
const lambdaXXXX = new lambda.Function(stackScope, 'XXXXFunction', {
runtime: Globals.LAMBDA_RUNTIME,
memorySize: Globals.LAMBDA_MEMORY_MAX,
code: lambda.Code.fromAsset(CDK_MODULE_ASSETS_PATH),
handler: 'xxxx-handler.handler',
timeout: Duration.minutes(Globals.LAMBDA_DURATION_2MIN),
environment: {
YYYY_ENV: (workflowContext.production) ? 'prod' : 'test',
YYYY_A_LOCATION: `s3://${workflowContext.S3ImportDataBucket}/adata-workflow/split-input/`,
YYYY_B_LOCATION: `s3://${workflowContext.S3ImportDataBucket}/bdata-workflow/split-input/` <--- added
}
})
lambdaXXXX.addToRolePolicy(new iam.PolicyStatement({
effect: Effect.ALLOW,
actions: ['s3:PutObject'],
resources: [
`arn:aws:s3:::${workflowContext.S3ImportDataBucket}/adata-workflow/split-input/*`,
`arn:aws:s3:::${workflowContext.S3ImportDataBucket}/bdata-workflow/split-input/*` <---- added
]
}))
}
I realize that those changes are not updated at stack.template.json:
...
"Runtime": "nodejs12.x",
"Environment": {
"Variables": {
"YYYY_ENV": "test",
"YYYY_A_LOCATION": "s3://.../adata-workflow/split-input/"
}
},
"MemorySize": 3008,
"Timeout": 120
}
...
I have cleaned cdk.out and tried deploy --force, but never see any updates.
Is deleting the stack and redeploying the only alternative, or am I missing something? I think at least synth should generate different results.
(I also changed to CDK 1.65.0 on my local system to match package.json.)
Thanks.
EDITED: I cloned the project fresh from git, ran npm install and cdk synth again, and finally saw the changes. I would like not to do this every time; any light on what could be blocking the correct synth generation?
EDITED 2: After a diff between the bad old project and the fresh clone where synth worked, I realized that some of my project files that had .ts (for example cdk.ts, my App definition) also had replicas with .js and .d.ts, such as cdk.js and cdk.d.ts. Could I have run some command by mistake that compiled the TypeScript? I will continue to investigate. Thanks to all answers.
Because CDK deploys through CloudFormation, it computes a change set to decide what to update. That is to say, if CloudFormation doesn't think anything has changed, it won't touch that resource.
This can, of course, be very annoying, as sometimes it decides nothing changed when there actually is a change. I find this most often with Layers, when using some form of makefile to generate the zips for the layers. Even though it makes a 'new' zip, whatever CDK uses to decide whether the zip is updated (some combination of compression and hashing) still considers it the same.
You can get around this by embedding a datetime in the description. The description is assigned at synth (which is part of cdk deploy), so using the current now() of datetime forces a change on every deployment.
You can also use cdk diff to see what it thinks the changes are.
And finally... always remember to save your file before deployments as, depending on your IDE, it may not be available to the command line ;)
One blunt workaround: comment out the Lambda section and deploy once, then uncomment it and deploy again. This deletes and recreates the Lambda, so the new code is picked up.
This is how I do it. Works nicely so far. Basically you can do the following:
Push your lambda code as a zip file to an S3 bucket. The bucket must have versioning enabled.
The CDK code below will do the following:
Create a custom resource. It basically calls s3.listObjectVersions for my lambda zip file in S3. I grab the first returned value, which seems to be the most recent object version all the time (I cannot confirm this with the documentation though). I also create a role for the custom resource.
Create the lambda and specify the code as the zip file in s3 AND THE OBJECT VERSION RETURNED BY THE CUSTOM RESOURCE! That is the most important part.
Create a new lambda version.
Then the lambda's code updates when you deploy the CDK stack!
const versionIdKey = 'Versions.0.VersionId';
const isLatestKey = 'Versions.0.IsLatest'
const now = new Date().toISOString();
const role = new Role(this, 'custom-resource-role', {
assumedBy: new ServicePrincipal('lambda.amazonaws.com'),
});
role.addManagedPolicy(ManagedPolicy.fromAwsManagedPolicyName('AdministratorAccess')); // you can make this more specific
// I'm not 100% sure this gives you the most recent first, but it seems to be doing that every time for me. I can't find anything in the docs about it...
const awsSdkCall: AwsSdkCall = {
action: "listObjectVersions",
parameters: {
Bucket: buildOutputBucket.bucketName, // s3 bucket with zip file containing lambda code.
MaxKeys: 1,
Prefix: LAMBDA_S3_KEY, // S3 key of zip file containing lambda code
},
physicalResourceId: PhysicalResourceId.of(buildOutputBucket.bucketName),
region: 'us-east-1', // or whatever region
service: "S3",
outputPaths: [versionIdKey, isLatestKey]
};
const customResourceName = 'get-object-version'
const customResourceId = `${customResourceName}-${now}` // not sure if `now` is necessary...
const response = new AwsCustomResource(this, customResourceId, {
functionName: customResourceName,
installLatestAwsSdk: true,
onCreate: awsSdkCall,
onUpdate: awsSdkCall,
policy: AwsCustomResourcePolicy.fromSdkCalls({resources: AwsCustomResourcePolicy.ANY_RESOURCE}), // you can make this more specific
resourceType: "Custom::ListObjectVersions",
role: role
})
const fn = new Function(this, 'my-lambda', {
functionName: 'my-lambda',
description: `${response.getResponseField(versionIdKey)}-${now}`,
runtime: Runtime.NODEJS_14_X,
memorySize: 1024,
timeout: Duration.seconds(5),
handler: 'index.handler',
code: Code.fromBucket(buildOutputBucket, LAMBDA_S3_KEY, response.getResponseField(versionIdKey)), // This is where the magic happens. You tell CDK to use a specific S3 object version when updating the lambda.
currentVersionOptions: {
removalPolicy: RemovalPolicy.DESTROY,
},
});
new Version(this, `version-${now}`, { // not sure if `now` is necessary...
lambda: fn,
removalPolicy: RemovalPolicy.DESTROY
})
Do note:
For this to work, you have to upload your lambda zip code to S3 before each cdk deploy. This can be the same code as before, but the S3 bucket versioning will create a new version. I use CodePipeline to do this as part of additional automation.
The documentation of @aws-cdk/pipelines seems to suggest that a CDK pipeline can be added to an existing @aws-cdk/aws-codepipeline Pipeline, using the codePipeline prop: https://docs.aws.amazon.com/cdk/api/latest/docs/#aws-cdk_pipelines.CodePipeline.html
codePipeline?: Pipeline - An existing Pipeline to be reused and built upon.
However, I am not able to get this to work and am experiencing multiple errors at the cdk synth step, depending on how I try to set it up. As far as I can tell there isn't really any documentation yet to cover this scenario.
Essentially, we are trying to create a pipeline that runs something like:
clone
lint / typecheck / unit test
cdk deploy to test environment
integration tests
deploy to preprod
smoke test
manual approval
deploy to prod
I guess it's just not clear what the difference is between this CodePipeline pipeline and the CDK pipeline. Also, the naming convention of stages seems a little unclear - referencing this issue: https://github.com/aws/aws-cdk/issues/15945
See: https://github.com/ChrisSargent/cdk-issues/blob/pipelines/lib/cdk-test-stack.ts and below:
import * as cdk from "@aws-cdk/core";
import * as pipelines from "@aws-cdk/pipelines";
import * as codepipeline from "@aws-cdk/aws-codepipeline";
import * as codepipeline_actions from "@aws-cdk/aws-codepipeline-actions";
export class CdkTestStack extends cdk.Stack {
constructor(scope: cdk.Construct, id: string, props?: cdk.StackProps) {
super(scope, id, props);
const cdkInput = pipelines.CodePipelineSource.gitHub(
"ChrisSargent/cdk-issues",
"pipelines"
);
// Setup the code source action
const sourceOutput = new codepipeline.Artifact();
const sourceAction = new codepipeline_actions.GitHubSourceAction({
owner: "ChrisSargent",
repo: "cdk-issues",
branch: "pipelines",
actionName: "SourceAction",
output: sourceOutput,
oauthToken: cdk.SecretValue.secretsManager("git/ChrisSargent"),
});
const pipeline = new codepipeline.Pipeline(this, "Pipeline", {
stages: [
{
actions: [sourceAction],
stageName: "GitSource",
},
],
});
const cdkPipeline = new pipelines.CodePipeline(this, "CDKPipeline", {
codePipeline: pipeline,
synth: new pipelines.ShellStep("Synth", {
// Without input, we get: Error: CodeBuild action 'Synth' requires an input (and the pipeline doesn't have a Source to fall back to). Add an input or a pipeline source.
// With input, we get:Error: Validation failed with the following errors: Source actions may only occur in first stage
input: cdkInput,
commands: ["yarn install --frozen-lockfile", "npx cdk synth"],
}),
});
// Produces: Stage 'PreProd' must have at least one action
// pipeline.addStage(new MyApplication(this, "PreProd"));
// Produces: The given Stage construct ('CdkTestStack/PreProd') should contain at least one Stack
cdkPipeline.addStage(new MyApplication(this, "PreProd"));
}
}
class MyApplication extends cdk.Stage {
constructor(scope: cdk.Construct, id: string, props?: cdk.StageProps) {
super(scope, id, props);
console.log("Nothing to deploy");
}
}
Any guidance or experience with this would be much appreciated.
I'm able to achieve something similar by adding waves/stages with only pre and post steps to the CDK pipeline. Sample code is listed below, amending your original snippet:
import * as cdk from "@aws-cdk/core";
import * as pipelines from "@aws-cdk/pipelines";
import * as codepipeline from "@aws-cdk/aws-codepipeline";
import * as codepipeline_actions from "@aws-cdk/aws-codepipeline-actions";
export class CdkTestStack extends cdk.Stack {
constructor(scope: cdk.Construct, id: string, props?: cdk.StackProps) {
super(scope, id, props);
const cdkInput = pipelines.CodePipelineSource.gitHub(
"ChrisSargent/cdk-issues",
"pipelines"
);
const cdkPipeline = new pipelines.CodePipeline(this, "CDKPipeline", {
selfMutation: true,
crossAccountKeys: true, //can be false if you don't need to deploy to a different account.
pipelineName, // assumes a `pipelineName` variable defined elsewhere
synth: new pipelines.ShellStep("Synth", {
// Without input, we get: Error: CodeBuild action 'Synth' requires an input (and the pipeline doesn't have a Source to fall back to). Add an input or a pipeline source.
// With input, we get:Error: Validation failed with the following errors: Source actions may only occur in first stage
input: cdkInput,
commands: ["yarn install --frozen-lockfile", "npx cdk synth"],
primaryOutputDirectory: 'cdk.out'
}),
});
// add any additional test steps here; they will run in parallel within the wave
cdkPipeline.addWave('test', {post: [provideUnitTestStep(this, 'unitTest')]});
// add a manual approve step if needed.
cdkPipeline.addWave('promotion', {post: [new ManualApprovalStep('PromoteToUat')]});
// Produces: Stage 'PreProd' must have at least one action
// pipeline.addStage(new MyApplication(this, "PreProd"));
// Produces: The given Stage construct ('CdkTestStack/PreProd') should contain at least one Stack
cdkPipeline.addStage(new MyApplication(this, "PreProd"));
}
}
class MyApplication extends cdk.Stage {
constructor(scope: cdk.Construct, id: string, props?: cdk.StageProps) {
super(scope, id, props);
console.log("Nothing to deploy");
}
}
Worth noting: you might need to convert the way you write your CodeBuild action to the new CDK CodeBuildStep. A sample unit test step looks like this:
const provideUnitTestStep = (
scope: Construct,
id: string
): cdkpipeline.CodeBuildStep => {
const props: CodeBuildStepProps = {
partialBuildSpec: codebuild.BuildSpec.fromObject({
version: '0.2',
env: {
variables: {
DEFINE_VARIBLES: 'someVariables'
}
},
phases: {
install: {
commands: [
'install some dependencies',
]
},
build: {
commands: [
'run some test!'
]
}
}
}),
commands: [],
buildEnvironment: {
buildImage: codebuild.LinuxBuildImage.STANDARD_5_0
}
};
return new cdkpipeline.CodeBuildStep(`${id}`, props);
};
It's not trivial to retrieve the underlying CodeBuild project role; you will need to pass the rolePolicyStatements property in the CodeBuildStep props to grant any extra permissions needed for your test.
First of all, the error Pipeline must have at least two stages is correct.
You only have the GitHub checkout/clone as a single stage.
For a second stage, you could use a CodeBuild project to compile/lint/unit test... as you mentioned.
However, what would you like to do with your compiled artifacts then?
Build containers to deploy them later?
If so, there are better ways with CDK of doing this (DockerImageAsset).
This could also spare you the preexisting pipeline, letting you use the CDK Pipeline directly.
Try setting the property restartExecutionOnUpdate: true on your regular Pipeline, as in the following snippet:
const pipeline = new codepipeline.Pipeline(this, "Pipeline", {
restartExecutionOnUpdate: true,
stages: [
{
actions: [sourceAction],
stageName: "GitSource",
},
],
});
This is needed for the self-mutation capability of the CDK pipeline.
This happened to me when I was creating a pipeline in a stack without specifically defined account and region.
Check if you have env like this:
new CdkStack(app, 'CdkStack', {
env: {
account: awsProdAccount,
region: defaultRegion,
}
});
I want to run AWS CDK synthesis from a Git repository using AWS CodeBuild - i.e., if I update the CDK app code in the repo, I want the CloudFormation stacks to be updated automatically. What are the best practices for setting up build role permissions?
For a GitHub repository, your CodeBuild role doesn't need additional permissions but it should have access to an oauthToken to access GitHub.
For a CodeCommit repository, create or import a codecommit.Repository object and use a CodeCommitSource object for your source parameter, and the build role permissions will be set up automatically (in particular, the permissions that will be added will be to codecommit:GitPull from the indicated repository).
See here.
You might also be interested in CDK's app-delivery package. It doesn't just create a CodeBuild project though, it uses CodePipeline to fetch, build and deploy a CDK application, so it might be more than you are looking for.
A month ago, AWS added a new module to the CDK suite called pipelines, which includes several utilities to ease the job of setting up self-modifying pipelines. In addition, there's codepipeline-actions, which includes constructs to hook your pipeline up to CodeCommit, GitHub, BitBucket, etc.
Here's a complete example (verbatim from the linked blog post), using github as a source, that deploys a lambda through CodePipeline:
Create a stage with your stack
import { CfnOutput, Construct, Stage, StageProps } from '@aws-cdk/core';
import { CdkpipelinesDemoStack } from './cdkpipelines-demo-stack';
/**
* Deployable unit of web service app
*/
export class CdkpipelinesDemoStage extends Stage {
public readonly urlOutput: CfnOutput;
constructor(scope: Construct, id: string, props?: StageProps) {
super(scope, id, props);
const service = new CdkpipelinesDemoStack(this, 'WebService');
// Expose CdkpipelinesDemoStack's output one level higher
this.urlOutput = service.urlOutput;
}
}
Create a stack with your pipeline
import * as codepipeline from '@aws-cdk/aws-codepipeline';
import * as codepipeline_actions from '@aws-cdk/aws-codepipeline-actions';
import { Construct, SecretValue, Stack, StackProps } from '@aws-cdk/core';
import { CdkPipeline, SimpleSynthAction } from "@aws-cdk/pipelines";
/**
* The stack that defines the application pipeline
*/
export class CdkpipelinesDemoPipelineStack extends Stack {
constructor(scope: Construct, id: string, props?: StackProps) {
super(scope, id, props);
const sourceArtifact = new codepipeline.Artifact();
const cloudAssemblyArtifact = new codepipeline.Artifact();
const pipeline = new CdkPipeline(this, 'Pipeline', {
// The pipeline name
pipelineName: 'MyServicePipeline',
cloudAssemblyArtifact,
// Where the source can be found
sourceAction: new codepipeline_actions.GitHubSourceAction({
actionName: 'GitHub',
output: sourceArtifact,
oauthToken: SecretValue.secretsManager('github-token'),
owner: 'OWNER',
repo: 'REPO',
}),
// How it will be built and synthesized
synthAction: SimpleSynthAction.standardNpmSynth({
sourceArtifact,
cloudAssemblyArtifact,
// We need a build step to compile the TypeScript Lambda
buildCommand: 'npm run build'
}),
});
// This is where we add the application stages
// ...
}
}