Namespace Amazon.CDK.Pipelines
CDK Pipelines
A construct library for painless Continuous Delivery of CDK applications.
CDK Pipelines is an opinionated construct library. It is purpose-built to deploy one or more copies of your CDK applications using CloudFormation with a minimal amount of effort on your part. It is not intended to support arbitrary deployment pipelines, and very specifically it is not built to use CodeDeploy to deploy applications to instances, or deploy your custom-built ECR images to an ECS cluster directly: use CDK file assets with CloudFormation Init for instances, or CDK container assets for ECS clusters instead.
Give the CDK Pipelines way of doing things a shot first: you might find it does everything you need. If you need more control, or if you need V2 support from aws-codepipeline, we recommend you drop down to using the aws-codepipeline construct library directly.
This module contains two sets of APIs: an original and a modern version of CDK Pipelines. The modern API has been updated to be easier to work with and customize, and is the preferred API going forward. The original version of the API is still available for backwards compatibility, but we recommend migrating to the new version if possible.
Compared to the original API, the modern API: has more sensible defaults; is more flexible; supports parallel deployments; supports multiple synth inputs; allows more control of CodeBuild project generation; supports deployment engines other than CodePipeline.
The README for the original API, as well as a migration guide, can be found in our GitHub repository.
At a glance
Deploying your application continuously starts by defining a MyApplicationStage, a subclass of Stage that contains the stacks that make up a single copy of your application.

You then define a Pipeline, instantiate as many instances of MyApplicationStage as you want for your test and production environments, each with different parameters, and call pipeline.addStage() for each of them. You can deploy to the same account and Region, or to a different one, with the same amount of code. The CDK Pipelines library takes care of the details.
CDK Pipelines supports multiple deployment engines (see
Using a different deployment engine),
and comes with a deployment engine that deploys CDK apps using AWS CodePipeline.
To use the CodePipeline engine, define a CodePipeline
construct. The following
example creates a CodePipeline that deploys an application from GitHub:
/** The stacks for our app are minimally defined here. The internals of these
* stacks aren't important, except that DatabaseStack exposes an attribute
* "table" for a database table it defines, and ComputeStack accepts a reference
* to this table in its properties.
*/
class DatabaseStack : Stack
{
public TableV2 Table { get; }
public DatabaseStack(Construct scope, string id) : base(scope, id)
{
Table = new TableV2(this, "Table", new TablePropsV2 {
PartitionKey = new Attribute { Name = "id", Type = AttributeType.STRING }
});
}
}
class ComputeProps
{
public TableV2 Table { get; set; }
}
class ComputeStack : Stack
{
public ComputeStack(Construct scope, string id, ComputeProps props) : base(scope, id)
{
}
}
/**
* Stack to hold the pipeline
*/
class MyPipelineStack : Stack
{
public MyPipelineStack(Construct scope, string id, StackProps? props=null) : base(scope, id, props)
{
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
Synth = new ShellStep("Synth", new ShellStepProps {
// Use a connection created using the AWS console to authenticate to GitHub
// Other sources are available.
Input = CodePipelineSource.Connection("my-org/my-app", "main", new ConnectionSourceOptions {
ConnectionArn = "arn:aws:codestar-connections:us-east-1:222222222222:connection/7d2469ff-514a-4e4f-9003-5ca4a43cdc41"
}),
Commands = new [] { "npm ci", "npm run build", "npx cdk synth" }
})
});
// 'MyApplication' is defined below. Call `addStage` as many times as
// necessary with any account and region (may be different from the
// pipeline's).
pipeline.AddStage(new MyApplication(this, "Prod", new StageProps {
Env = new Environment {
Account = "123456789012",
Region = "eu-west-1"
}
}));
}
}
/**
* Your application
*
* May consist of one or more Stacks (here, two)
*
* By declaring our DatabaseStack and our ComputeStack inside a Stage,
* we make sure they are deployed together, or not at all.
*/
class MyApplication : Stage
{
public MyApplication(Construct scope, string id, StageProps? props=null) : base(scope, id, props)
{
var dbStack = new DatabaseStack(this, "Database");
new ComputeStack(this, "Compute", new ComputeProps {
Table = dbStack.Table
});
}
}
// In your main file
new MyPipelineStack(this, "PipelineStack", new StackProps {
Env = new Environment {
Account = "123456789012",
Region = "eu-west-1"
}
});
The pipeline is self-mutating, which means that if you add new
application stages in the source code, or new stacks to MyApplication
, the
pipeline will automatically reconfigure itself to deploy those new stages and
stacks.
(Note that you have to bootstrap all environments before the above code will work, and switch on "Modern synthesis" if you are using CDKv1. See the section CDK Environment Bootstrapping below for more information).
Provisioning the pipeline
To provision the pipeline you have defined, make sure the target environment has been bootstrapped (see below), and then deploy the PipelineStack once. Afterwards, the pipeline will keep itself up to date.
Important: be sure to git commit
and git push
before deploying the
Pipeline stack using cdk deploy
!
The reason is that the pipeline will start deploying and self-mutating right away based on the sources in the repository, so the sources it finds in there should be the ones you want it to find.
Run the following commands to get the pipeline going:
$ git commit -a
$ git push
$ cdk deploy PipelineStack
Administrative permissions to the account are only necessary up until this point. We recommend you remove access to these credentials after doing this.
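Bootstrapping is done with the CDK CLI. As a rough sketch (the account IDs and Region below are placeholders, and the broad execution policy is only an example that you should scope down for production), bootstrapping the pipeline's environment and a trusted deployment target might look like this:

```shell
# Bootstrap the environment that will host the pipeline itself
npx cdk bootstrap aws://111111111111/eu-west-1

# Bootstrap a deployment target, trusting the pipeline's account so the
# pipeline can assume the deployment roles created there
npx cdk bootstrap aws://123456789012/eu-west-1 \
  --trust 111111111111 \
  --cloudformation-execution-policies arn:aws:iam::aws:policy/AdministratorAccess
```

These commands require AWS credentials with administrative access to the respective accounts, which is why we recommend removing that access once bootstrapping and the initial deployment are done.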
Working on the pipeline
The self-mutation feature of the Pipeline might at times get in the way of the pipeline development workflow. Each change to the pipeline must be pushed to git; otherwise, after the pipeline was updated using cdk deploy, it will automatically revert to the state found in git.

To make development more convenient, the self-mutation feature can be turned off temporarily by passing the selfMutation: false property, for example:
// Modern API
var modernPipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
SelfMutation = false,
Synth = new ShellStep("Synth", new ShellStepProps {
Input = CodePipelineSource.Connection("my-org/my-app", "main", new ConnectionSourceOptions {
ConnectionArn = "arn:aws:codestar-connections:us-east-1:222222222222:connection/7d2469ff-514a-4e4f-9003-5ca4a43cdc41"
}),
Commands = new [] { "npm ci", "npm run build", "npx cdk synth" }
})
});
// Original API
var cloudAssemblyArtifact = new Artifact();
var originalPipeline = new CdkPipeline(this, "Pipeline", new CdkPipelineProps {
SelfMutating = false,
CloudAssemblyArtifact = cloudAssemblyArtifact
});
Defining the pipeline
This section of the documentation describes the AWS CodePipeline engine, which comes with this library. If you want to use a different deployment engine, read the section Using a different deployment engine below.
Synth and sources
To define a pipeline, instantiate a CodePipeline
construct from the
aws-cdk-lib/pipelines
module. It takes one argument, a synth
step, which is
expected to produce the CDK Cloud Assembly as its single output (the contents of
the cdk.out
directory after running cdk synth
). "Steps" are arbitrary
actions in the pipeline, typically used to run scripts or commands.
For the synth, use a ShellStep
and specify the commands necessary to install
dependencies, the CDK CLI, build your project and run cdk synth
; the specific
commands required will depend on the programming language you are using. For a
typical NPM-based project, the synth will look like this:
IFileSetProducer source;
// the repository source
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
Synth = new ShellStep("Synth", new ShellStepProps {
Input = source,
Commands = new [] { "npm ci", "npm run build", "npx cdk synth" }
})
});
The pipeline assumes that your ShellStep
will produce a cdk.out
directory in the root, containing the CDK cloud assembly. If your
CDK project lives in a subdirectory, be sure to adjust the
primaryOutputDirectory
to match:
IFileSetProducer source;
// the repository source
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
Synth = new ShellStep("Synth", new ShellStepProps {
Input = source,
Commands = new [] { "cd mysubdir", "npm ci", "npm run build", "npx cdk synth" },
PrimaryOutputDirectory = "mysubdir/cdk.out"
})
});
The underlying aws-cdk-lib/aws-codepipeline.Pipeline
construct will be produced
when app.synth()
is called. You can also force it to be produced
earlier by calling pipeline.buildPipeline()
. After you've called
that method, you can inspect the constructs that were produced by
accessing the properties of the pipeline
object.
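For example (a minimal sketch, assuming `pipeline` is the CodePipeline construct defined earlier), you can force early construction and then inspect the generated resources via the pipeline's properties:

```csharp
// Force the underlying resources to be created now, instead of at app.synth()
pipeline.BuildPipeline();

// After BuildPipeline(), the generated constructs are available as properties
var generatedPipeline = pipeline.Pipeline;  // the underlying aws-codepipeline Pipeline
var synthProject = pipeline.SynthProject;   // the CodeBuild project for the synth step
```

Calling BuildPipeline() is only needed if you want to customize or inspect the generated constructs before synthesis; otherwise the pipeline builds itself automatically.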
Commands for other languages and package managers
The commands you pass to new ShellStep
will be very similar to the commands
you run on your own workstation to install dependencies and synth your CDK
project. Here are some (non-exhaustive) examples for what those commands might
look like in a number of different situations.
For Yarn, the install commands are different:
IFileSetProducer source;
// the repository source
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
Synth = new ShellStep("Synth", new ShellStepProps {
Input = source,
Commands = new [] { "yarn install --frozen-lockfile", "yarn build", "npx cdk synth" }
})
});
For Python projects, remember to install the CDK CLI globally (as
there is no package.json
to automatically install it for you):
IFileSetProducer source;
// the repository source
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
Synth = new ShellStep("Synth", new ShellStepProps {
Input = source,
Commands = new [] { "pip install -r requirements.txt", "npm install -g aws-cdk", "cdk synth" }
})
});
For Java projects, remember to install the CDK CLI globally (as there is no package.json to automatically install it for you). The Maven compilation step is automatically executed for you when you run cdk synth:
IFileSetProducer source;
// the repository source
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
Synth = new ShellStep("Synth", new ShellStepProps {
Input = source,
Commands = new [] { "npm install -g aws-cdk", "cdk synth" }
})
});
You can adapt these examples to your own situation.
Migrating from buildspec.yml files
You may currently have the build instructions for your CodeBuild Projects in a
buildspec.yml
file in your source repository. In addition to your build
commands, the CodeBuild Project's buildspec also controls some information that
CDK Pipelines manages for you, like artifact identifiers, input artifact
locations, Docker authorization, and exported variables.
Since there is no way in general for CDK Pipelines to modify the file in your source repository, CDK Pipelines configures the BuildSpec directly on the CodeBuild Project, instead of loading it from the buildspec.yml file. Any change to that configuration therefore requires a pipeline self-mutation to take effect.
To avoid this, put your build instructions in a separate script, for example
build.sh
, and call that script from the build commands
array:
IFileSetProducer source;
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
Synth = new ShellStep("Synth", new ShellStepProps {
Input = source,
Commands = new [] { "./build.sh" }
})
});
Doing so keeps your exact build instructions in sync with your source code in the source repository where it belongs, and provides a convenient build script for developers at the same time.
CodePipeline Sources
In CodePipeline, Sources define where the source of your application lives.
When a change to the source is detected, the pipeline will start executing.
Source objects can be created by factory methods on the CodePipelineSource
class:
GitHub, GitHub Enterprise, BitBucket using a connection
The recommended way of connecting to GitHub or BitBucket is by using a connection. You will first use the AWS Console to authenticate to the source control provider, and then use the connection ARN in your pipeline definition:
CodePipelineSource.Connection("org/repo", "branch", new ConnectionSourceOptions {
ConnectionArn = "arn:aws:codestar-connections:us-east-1:222222222222:connection/7d2469ff-514a-4e4f-9003-5ca4a43cdc41"
});
GitHub using OAuth
You can also authenticate to GitHub using a personal access token. This expects that you've created a personal access token and stored it in Secrets Manager. By default, the source object will look for a secret named github-token, but you can change the name. The token should have the repo and admin:repo_hook scopes.
CodePipelineSource.GitHub("org/repo", "branch", new GitHubSourceOptions {
// This is optional
Authentication = SecretValue.SecretsManager("my-token")
});
CodeCommit
You can use a CodeCommit repository as the source. Either create or import the CodeCommit repository, then use CodePipelineSource.codeCommit to reference it:
var repository = Repository.FromRepositoryName(this, "Repository", "my-repository");
CodePipelineSource.CodeCommit(repository, "main");
S3
You can use a zip file in S3 as the source of the pipeline. The pipeline will be triggered every time the file in S3 is changed:
var bucket = Bucket.FromBucketName(this, "Bucket", "my-bucket");
CodePipelineSource.S3(bucket, "my/source.zip");
ECR
You can use a Docker image in ECR as the source of the pipeline. The pipeline will be triggered every time an image is pushed to ECR:
var repository = new Repository(this, "Repository");
CodePipelineSource.Ecr(repository);
Additional inputs
ShellStep allows passing in more than one input: additional inputs will be placed in the directories you specify. Any step that produces an output file set can be used as an input, such as a CodePipelineSource, but also other ShellSteps:
var prebuild = new ShellStep("Prebuild", new ShellStepProps {
Input = CodePipelineSource.GitHub("myorg/repo1", "main"),
PrimaryOutputDirectory = "./build",
Commands = new [] { "./build.sh" }
});
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
Synth = new ShellStep("Synth", new ShellStepProps {
Input = CodePipelineSource.GitHub("myorg/repo2", "main"),
AdditionalInputs = new Dictionary<string, IFileSetProducer> {
{ "subdir", CodePipelineSource.GitHub("myorg/repo3", "main") },
{ "../siblingdir", prebuild }
},
Commands = new [] { "./build.sh" }
})
});
CDK application deployments
After you have defined the pipeline and the synth
step, you can add one or
more CDK Stages
which will be deployed to their target environments. To do
so, call pipeline.addStage()
on the Stage object:
CodePipeline pipeline;
// Do this as many times as necessary with any account and region
// The account and Region may differ from the pipeline's.
pipeline.AddStage(new MyApplicationStage(this, "Prod", new StageProps {
Env = new Environment {
Account = "123456789012",
Region = "eu-west-1"
}
}));
CDK Pipelines will automatically discover all Stacks
in the given Stage
object, determine their dependency order, and add appropriate actions to the
pipeline to publish the assets referenced in those stacks and deploy the stacks
in the right order.
If the Stacks
are targeted at an environment in a different AWS account or
Region and that environment has been
bootstrapped
, CDK Pipelines will transparently make sure the IAM roles are set up
correctly and any requisite replication Buckets are created.
Deploying in parallel
By default, all applications added to CDK Pipelines by calling addStage()
will
be deployed in sequence, one after the other. If you have a lot of stages, you can
speed up the pipeline by choosing to deploy some stages in parallel. You do this
by calling addWave()
instead of addStage()
: a wave is a set of stages that
are all deployed in parallel instead of sequentially. Waves themselves are still
deployed in sequence. For example, the following will deploy two copies of your
application to eu-west-1
and eu-central-1
in parallel:
CodePipeline pipeline;
var europeWave = pipeline.AddWave("Europe");
europeWave.AddStage(new MyApplicationStage(this, "Ireland", new StageProps {
Env = new Environment { Region = "eu-west-1" }
}));
europeWave.AddStage(new MyApplicationStage(this, "Germany", new StageProps {
Env = new Environment { Region = "eu-central-1" }
}));
Deploying to other accounts / encrypting the Artifact Bucket
CDK Pipelines can transparently deploy to other Regions and other accounts
(provided those target environments have been
bootstrapped).
However, deploying to another account requires one additional piece of
configuration: you need to enable crossAccountKeys: true
when creating the
pipeline.
This will encrypt the artifact bucket(s), but incurs a cost for maintaining the KMS key.
You may also wish to enable automatic key rotation for the created KMS key.
Example:
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
// Encrypt artifacts, required for cross-account deployments
CrossAccountKeys = true,
EnableKeyRotation = true, // optional
Synth = new ShellStep("Synth", new ShellStepProps {
Input = CodePipelineSource.Connection("my-org/my-app", "main", new ConnectionSourceOptions {
ConnectionArn = "arn:aws:codestar-connections:us-east-1:222222222222:connection/7d2469ff-514a-4e4f-9003-5ca4a43cdc41"
}),
Commands = new [] { "npm ci", "npm run build", "npx cdk synth" }
})
});
Deploying without change sets
By default, the CodePipeline engine deploys using CloudFormation change sets: it first creates a change set and then executes it. This allows you to inject steps that inspect the change set and approve or reject it, but failed deployments are not retryable and creating the change set takes time.

The creation of change sets can be switched off by setting useChangeSets: false:
ShellStep synth;
class PipelineStack : Stack
{
public PipelineStack(Construct scope, string id, StackProps? props=null) : base(scope, id, props)
{
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
Synth = synth,
// Disable change set creation and make deployments in pipeline as single step
UseChangeSets = false
});
}
}
Validation
Every addStage()
and addWave()
command takes additional options. As part of these options,
you can specify pre
and post
steps, which are arbitrary steps that run before or after
the contents of the stage or wave, respectively. You can use these to add validations like
manual or automated gates to your pipeline. We recommend putting manual approval gates in the set of pre
steps, and automated approval gates in
the set of post
steps.
The following example shows both an automated approval in the form of a ShellStep
, and
a manual approval in the form of a ManualApprovalStep
added to the pipeline. Both must
pass in order to promote from the PreProd
to the Prod
environment:
CodePipeline pipeline;
var preprod = new MyApplicationStage(this, "PreProd");
var prod = new MyApplicationStage(this, "Prod");
pipeline.AddStage(preprod, new AddStageOpts {
Post = new [] {
new ShellStep("Validate Endpoint", new ShellStepProps {
Commands = new [] { "curl -Ssf https://my.webservice.com/" }
}) }
});
pipeline.AddStage(prod, new AddStageOpts {
Pre = new [] {
new ManualApprovalStep("PromoteToProd") }
});
You can also specify steps to be executed at the stack level. To achieve this, you can specify the stack and step via the stackSteps
property:
CodePipeline pipeline;
class MyStacksStage : Stage
{
public Stack Stack1 { get; }
public Stack Stack2 { get; }
public MyStacksStage(Construct scope, string id, StageProps? props=null) : base(scope, id, props)
{
Stack1 = new Stack(this, "stack1");
Stack2 = new Stack(this, "stack2");
}
}
var prod = new MyStacksStage(this, "Prod");
pipeline.AddStage(prod, new AddStageOpts {
StackSteps = new [] { new StackSteps {
Stack = prod.Stack1,
Pre = new [] { new ManualApprovalStep("Pre-Stack Check") }, // Executed before stack is prepared
ChangeSet = new [] { new ManualApprovalStep("ChangeSet Approval") }, // Executed after stack is prepared but before the stack is deployed
Post = new [] { new ManualApprovalStep("Post-Deploy Check") }
}, new StackSteps {
Stack = prod.Stack2,
Post = new [] { new ManualApprovalStep("Post-Deploy Check") }
} }
});
If you specify multiple steps, they will execute in parallel by default. If you wish to specify an order, you can add dependencies between them by calling step.addStepDependency():
var firstStep = new ManualApprovalStep("A");
var secondStep = new ManualApprovalStep("B");
secondStep.AddStepDependency(firstStep);
For convenience, Step.sequence()
will take an array of steps and dependencies between adjacent steps,
so that the whole list executes in order:
// Step B will depend on step A, and step C will depend on step B
var orderedSteps = Step.Sequence(new [] {
new ManualApprovalStep("A"),
new ManualApprovalStep("B"),
new ManualApprovalStep("C") });
Using CloudFormation Stack Outputs in approvals
Because many CloudFormation deployments result in the generation of resources with unpredictable names, validations have support for reading back CloudFormation Outputs after a deployment. This makes it possible to pass (for example) the generated URL of a load balancer to the test set.
To use Stack Outputs, expose the CfnOutput
object you're interested in, and
pass it to envFromCfnOutputs
of the ShellStep
:
CodePipeline pipeline;
class MyOutputStage : Stage
{
public CfnOutput LoadBalancerAddress { get; }
public MyOutputStage(Construct scope, string id, StageProps? props=null) : base(scope, id, props)
{
LoadBalancerAddress = new CfnOutput(this, "Output", new CfnOutputProps { Value = "value" });
}
}
var lbApp = new MyOutputStage(this, "MyApp");
pipeline.AddStage(lbApp, new AddStageOpts {
Post = new [] {
new ShellStep("HitEndpoint", new ShellStepProps {
EnvFromCfnOutputs = new Dictionary<string, CfnOutput> {
// Make the load balancer address available as $URL inside the commands
{ "URL", lbApp.LoadBalancerAddress }
},
Commands = new [] { "curl -Ssf $URL" }
}) }
});
Running scripts compiled during the synth step
As part of a validation, you probably want to run a test suite that's more
elaborate than what can be expressed in a couple of lines of shell script.
You can bring additional files into the shell script validation by supplying
the input
or additionalInputs
property of ShellStep
. The input can
be produced by the Synth
step, or come from a source or any other build
step.
Here's an example that captures an additional output directory in the synth step and runs tests from there:
ShellStep synth;
var stage = new MyApplicationStage(this, "MyApplication");
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps { Synth = synth });
pipeline.AddStage(stage, new AddStageOpts {
Post = new [] {
new ShellStep("Approve", new ShellStepProps {
// Use the contents of the 'integ' directory from the synth step as the input
Input = synth.AddOutputDirectory("integ"),
Commands = new [] { "cd integ && ./run.sh" }
}) }
});
Customizing CodeBuild Projects
CDK Pipelines will generate CodeBuild projects for each ShellStep
you use, and it
will also generate CodeBuild projects to publish assets and perform the self-mutation
of the pipeline. To control the various aspects of the CodeBuild projects that get
generated, use a CodeBuildStep
instead of a ShellStep
. This class has a number
of properties that allow you to customize various aspects of the projects:
Vpc vpc;
SecurityGroup mySecurityGroup;
new CodeBuildStep("Synth", new CodeBuildStepProps {
// ...standard ShellStep props...
Commands = new [] { },
Env = new Dictionary<string, object> { },
// If you are using a CodeBuildStep explicitly, set the 'cdk.out' directory
// to be the synth step's output.
PrimaryOutputDirectory = "cdk.out",
// Control the name of the project
ProjectName = "MyProject",
// Control parts of the BuildSpec other than the regular 'build' and 'install' commands
PartialBuildSpec = BuildSpec.FromObject(new Dictionary<string, object> {
{ "version", "0.2" }
}),
// Control the build environment
BuildEnvironment = new BuildEnvironment {
ComputeType = ComputeType.LARGE,
Privileged = true
},
Timeout = Duration.Minutes(90),
FileSystemLocations = new [] { FileSystemLocation.Efs(new EfsFileSystemLocationProps {
Identifier = "myidentifier2",
Location = "myclodation.mydnsroot.com:/loc",
MountPoint = "/media",
MountOptions = "opts"
}) },
// Control Elastic Network Interface creation
Vpc = vpc,
SubnetSelection = new SubnetSelection { SubnetType = SubnetType.PRIVATE_WITH_EGRESS },
SecurityGroups = new [] { mySecurityGroup },
// Control caching
Cache = Cache.Bucket(new Bucket(this, "Cache")),
// Additional policy statements for the execution role
RolePolicyStatements = new [] {
new PolicyStatement(new PolicyStatementProps { }) }
});
You can also configure defaults for all CodeBuild projects by passing codeBuildDefaults
,
or just for the synth, asset publishing, and self-mutation projects by passing synthCodeBuildDefaults
,
assetPublishingCodeBuildDefaults
, or selfMutationCodeBuildDefaults
:
using Amazon.CDK.AWS.Logs;
Vpc vpc;
SecurityGroup mySecurityGroup;
new CodePipeline(this, "Pipeline", new CodePipelineProps {
// Standard CodePipeline properties
Synth = new ShellStep("Synth", new ShellStepProps {
Input = CodePipelineSource.Connection("my-org/my-app", "main", new ConnectionSourceOptions {
ConnectionArn = "arn:aws:codestar-connections:us-east-1:222222222222:connection/7d2469ff-514a-4e4f-9003-5ca4a43cdc41"
}),
Commands = new [] { "npm ci", "npm run build", "npx cdk synth" }
}),
// Defaults for all CodeBuild projects
CodeBuildDefaults = new CodeBuildOptions {
// Prepend commands and configuration to all projects
PartialBuildSpec = BuildSpec.FromObject(new Dictionary<string, object> {
{ "version", "0.2" }
}),
// Control the build environment
BuildEnvironment = new BuildEnvironment {
ComputeType = ComputeType.LARGE
},
// Control Elastic Network Interface creation
Vpc = vpc,
SubnetSelection = new SubnetSelection { SubnetType = SubnetType.PRIVATE_WITH_EGRESS },
SecurityGroups = new [] { mySecurityGroup },
// Additional policy statements for the execution role
RolePolicy = new [] {
new PolicyStatement(new PolicyStatementProps { }) },
// Information about logs
Logging = new LoggingOptions {
CloudWatch = new CloudWatchLoggingOptions {
LogGroup = new LogGroup(this, "MyLogGroup")
},
S3 = new S3LoggingOptions {
Bucket = new Bucket(this, "LogBucket")
}
}
},
SynthCodeBuildDefaults = new CodeBuildOptions { },
AssetPublishingCodeBuildDefaults = new CodeBuildOptions { },
SelfMutationCodeBuildDefaults = new CodeBuildOptions { }
});
Arbitrary CodePipeline actions
If you want to add a type of CodePipeline action to the CDK Pipeline that
doesn't have a matching class yet, you can define your own step class that extends
Step
and implements ICodePipelineActionFactory
.
Here's an example that adds a Jenkins step:
class MyJenkinsStep : Step, ICodePipelineActionFactory
{
private readonly JenkinsProvider Provider;
private readonly FileSet Input;
public MyJenkinsStep(JenkinsProvider provider, FileSet input) : base("MyJenkinsStep")
{
// Store the provider and input; they are used in ProduceAction below
Provider = provider;
Input = input;
// This is necessary if your step accepts parameters, like environment variables,
// that may contain outputs from other steps. It doesn't matter what the
// structure is, as long as it contains the values that may contain outputs.
DiscoverReferencedOutputs(new Dictionary<string, IDictionary<string, object>> {
{ "env", new Struct { } }
});
}
public CodePipelineActionFactoryResult ProduceAction(IStage stage, ProduceActionOptions options)
{
// This is where you control what type of Action gets added to the
// CodePipeline
stage.AddAction(new JenkinsAction(new JenkinsActionProps {
// Copy 'actionName' and 'runOrder' from the options
ActionName = options.ActionName,
RunOrder = options.RunOrder,
// Jenkins-specific configuration
Type = JenkinsActionType.TEST,
JenkinsProvider = Provider,
ProjectName = "MyJenkinsProject",
// Translate the FileSet into a codepipeline.Artifact
Inputs = new [] { options.Artifacts.ToCodePipeline(Input) }
}));
return new CodePipelineActionFactoryResult { RunOrdersConsumed = 1 };
}
}
Another example, adding a lambda step referencing outputs from a stack:
class MyLambdaStep : Step, ICodePipelineActionFactory
{
private readonly Function Fn;
private readonly StackOutputReference stackOutputReference;
public MyLambdaStep(Function fn, CfnOutput stackOutput) : base("MyLambdaStep")
{
Fn = fn;
stackOutputReference = StackOutputReference.FromCfnOutput(stackOutput);
}
public CodePipelineActionFactoryResult ProduceAction(IStage stage, ProduceActionOptions options)
{
stage.AddAction(new LambdaInvokeAction(new LambdaInvokeActionProps {
ActionName = options.ActionName,
RunOrder = options.RunOrder,
// Map the reference to the variable name the CDK has generated for you.
UserParameters = new Dictionary<string, object> { { "stackOutput", options.StackOutputsMap.ToCodePipeline(stackOutputReference) } },
Lambda = Fn
}));
return new CodePipelineActionFactoryResult { RunOrdersConsumed = 1 };
}
// Expose the consumed stack output so the pipeline wires the value through
public StackOutputReference[] ConsumedStackOutputs
{
get { return new [] { stackOutputReference }; }
}
}
Using an existing AWS CodePipeline
If you wish to use an existing CodePipeline.Pipeline
while using the modern API's
methods and classes, you can pass in the existing CodePipeline.Pipeline
to be built upon
instead of having the pipelines.CodePipeline
construct create a new CodePipeline.Pipeline
.
This also gives you more direct control over the underlying CodePipeline.Pipeline
construct
if the way the modern API creates it doesn't allow for desired configurations. Use CodePipelineFileset
to convert CodePipeline artifacts into CDK Pipelines file sets,
that can be used everywhere a file set or file set producer is expected.
Here's an example of passing in an existing pipeline and using a source that's already in the pipeline:
Pipeline codePipeline;
var sourceArtifact = new Artifact("MySourceArtifact");
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
CodePipeline = codePipeline,
Synth = new ShellStep("Synth", new ShellStepProps {
Input = CodePipelineFileSet.FromArtifact(sourceArtifact),
Commands = new [] { "npm ci", "npm run build", "npx cdk synth" }
})
});
If your existing pipeline already provides a synth step, pass the existing
artifact in place of the synth
step:
Pipeline codePipeline;
var buildArtifact = new Artifact("MyBuildArtifact");
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
CodePipeline = codePipeline,
Synth = CodePipelineFileSet.FromArtifact(buildArtifact)
});
Note that if you provide an existing pipeline, you cannot provide values for
pipelineName
, crossAccountKeys
, reuseCrossRegionSupportStacks
, or role
because those values are passed in directly to the underlying codepipeline.Pipeline
.
Using Docker in the pipeline
Docker can be used in 3 different places in the pipeline:

1. If your CDK application uses Docker image assets, the pipeline needs to build and publish them.
2. If your pipeline itself uses Docker image assets (for example, as a CodeBuild step's build image), the self-mutation step needs to run Docker.
3. If your CDK application uses bundled file assets, the synth step needs to run Docker to perform the bundling.

For the first case, you don't need to do anything special. For the other two cases, you need to make sure that privileged mode is enabled on the correct CodeBuild projects, so that Docker can run correctly. The following sections describe how to do that.
You may also need to authenticate to Docker registries to avoid being throttled. See the section Authenticating to Docker registries below for information on how to do that.
Using Docker image assets in the pipeline
If your PipelineStack is using Docker image assets (as opposed to the application stacks the pipeline is deploying), for example by the use of LinuxBuildImage.fromAsset(), you need to pass dockerEnabledForSelfMutation: true to the pipeline. For example:
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
Synth = new ShellStep("Synth", new ShellStepProps {
Input = CodePipelineSource.Connection("my-org/my-app", "main", new ConnectionSourceOptions {
ConnectionArn = "arn:aws:codestar-connections:us-east-1:222222222222:connection/7d2469ff-514a-4e4f-9003-5ca4a43cdc41"
}),
Commands = new [] { "npm ci", "npm run build", "npx cdk synth" }
}),
// Turn this on because the pipeline uses Docker image assets
DockerEnabledForSelfMutation = true
});
pipeline.AddWave("MyWave", new WaveOptions {
Post = new [] {
new CodeBuildStep("RunApproval", new CodeBuildStepProps {
Commands = new [] { "command-from-image" },
BuildEnvironment = new BuildEnvironment {
// The use of a Docker image asset in the pipeline requires turning on
// 'dockerEnabledForSelfMutation'.
BuildImage = LinuxBuildImage.FromAsset(this, "Image", new DockerImageAssetProps {
Directory = "./docker-image"
})
}
}) }
});
Important: You must turn on the dockerEnabledForSelfMutation flag, commit, and allow the pipeline to self-update before adding the actual Docker asset.
Using bundled file assets
If you are using asset bundling anywhere (such as automatically done for you if you add a construct like aws-cdk-lib/aws-lambda-nodejs), you need to pass dockerEnabledForSynth: true to the pipeline. For example:
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
Synth = new ShellStep("Synth", new ShellStepProps {
Input = CodePipelineSource.Connection("my-org/my-app", "main", new ConnectionSourceOptions {
ConnectionArn = "arn:aws:codestar-connections:us-east-1:222222222222:connection/7d2469ff-514a-4e4f-9003-5ca4a43cdc41"
}),
Commands = new [] { "npm ci", "npm run build", "npx cdk synth" }
}),
// Turn this on because the application uses bundled file assets
DockerEnabledForSynth = true
});
Important: You must turn on the dockerEnabledForSynth flag, commit, and allow the pipeline to self-update before adding the actual Docker asset.
Authenticating to Docker registries
You can specify credentials to use for authenticating to Docker registries as part of the pipeline definition. This can be useful if any Docker image assets — in the pipeline or any of the application stages — require authentication, either due to being in a different environment (e.g., ECR repo) or to avoid throttling (e.g., DockerHub).
var dockerHubSecret = Secret.FromSecretCompleteArn(this, "DHSecret", "arn:aws:...");
var customRegSecret = Secret.FromSecretCompleteArn(this, "CRSecret", "arn:aws:...");
var repo1 = Repository.FromRepositoryArn(this, "Repo1", "arn:aws:ecr:eu-west-1:0123456789012:repository/Repo1");
var repo2 = Repository.FromRepositoryArn(this, "Repo2", "arn:aws:ecr:eu-west-1:0123456789012:repository/Repo2");
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
DockerCredentials = new [] { DockerCredential.DockerHub(dockerHubSecret), DockerCredential.CustomRegistry("dockerregistry.example.com", customRegSecret), DockerCredential.Ecr(new [] { repo1, repo2 }) },
Synth = new ShellStep("Synth", new ShellStepProps {
Input = CodePipelineSource.Connection("my-org/my-app", "main", new ConnectionSourceOptions {
ConnectionArn = "arn:aws:codestar-connections:us-east-1:222222222222:connection/7d2469ff-514a-4e4f-9003-5ca4a43cdc41"
}),
Commands = new [] { "npm ci", "npm run build", "npx cdk synth" }
})
});
For authenticating to Docker registries that require a username and password combination (like DockerHub), create a Secrets Manager Secret with fields named username and secret, and import it (the field names can be customized).
Authentication to ECR repositories is done using the execution role of the relevant CodeBuild job. Both types of credentials can be provided with an optional role to assume before requesting the credentials.
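As an illustration, here is a sketch of customizing the secret's field names and supplying a role to assume before fetching the credentials. The secret ARN, role ARN, and field names are placeholders and assume such a secret and role exist:

```csharp
// Sketch: registry credentials whose secret uses non-default field names,
// fetched after assuming a dedicated role. All names here are illustrative.
var secret = Secret.FromSecretCompleteArn(this, "RegistrySecret", "arn:aws:...");
var secretsRole = Role.FromRoleArn(this, "SecretsRole", "arn:aws:iam::0123456789012:role/docker-secrets-role");
var creds = DockerCredential.CustomRegistry("dockerregistry.example.com", secret, new ExternalDockerCredentialOptions {
    SecretUsernameField = "user", // instead of the default "username"
    SecretPasswordField = "pass", // instead of the default "secret"
    AssumeRole = secretsRole
});
```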
By default, the Docker credentials provided to the pipeline will be available to the Synth, Self-Update, and Asset Publishing actions within the pipeline. The scope of the credentials can be limited via the DockerCredentialUsage option.
var dockerHubSecret = Secret.FromSecretCompleteArn(this, "DHSecret", "arn:aws:...");
// Only the image asset publishing actions will be granted read access to the secret.
var creds = DockerCredential.DockerHub(dockerHubSecret, new ExternalDockerCredentialOptions { Usages = new [] { DockerCredentialUsage.ASSET_PUBLISHING } });
CDK Environment Bootstrapping
An environment is an (account, region) pair where you want to deploy a CDK stack (see Environments in the CDK Developer Guide). In a Continuous Deployment pipeline, there are at least two environments involved: the environment where the pipeline is provisioned, and the environment where you want to deploy the application (or different stages of the application). These can be the same, though best practices recommend you isolate your different application stages from each other in different AWS accounts or regions.
Before you can provision the pipeline, you have to bootstrap the environment you want to create it in. If you are deploying your application to different environments, you also have to bootstrap those and be sure to add a trust relationship.
After you have bootstrapped an environment and created a pipeline that deploys to it, it's important that you don't delete the stack or change its Qualifier, or future deployments to this environment will fail. If you want to upgrade the bootstrap stack to a newer version, do that by updating it in-place.
This library requires the <em>modern</em> bootstrapping stack which has
been updated specifically to support cross-account continuous delivery.
If you are using CDKv2, you do not need to do anything else. Modern bootstrapping and modern stack synthesis (also known as "default stack synthesis") is the default.
If you are using CDKv1, you need to opt in to modern bootstrapping and modern stack synthesis using a feature flag. Make sure cdk.json includes:
{
"context": {
"@aws-cdk/core:newStyleStackSynthesis": true
}
}
And be sure to run cdk bootstrap in the same directory as the cdk.json file.
To bootstrap an environment for provisioning the pipeline:
$ npx cdk bootstrap \
[--profile admin-profile-1] \
--cloudformation-execution-policies arn:aws:iam::aws:policy/AdministratorAccess \
aws://111111111111/us-east-1
To bootstrap a different environment for deploying CDK applications into using a pipeline in account 111111111111:
$ npx cdk bootstrap \
[--profile admin-profile-2] \
--cloudformation-execution-policies arn:aws:iam::aws:policy/AdministratorAccess \
--trust 111111111111 \
aws://222222222222/us-east-2
If you only want to trust an account to do lookups (e.g., when your CDK application has a Vpc.fromLookup() call), use the option --trust-for-lookup:
$ npx cdk bootstrap \
[--profile admin-profile-2] \
--cloudformation-execution-policies arn:aws:iam::aws:policy/AdministratorAccess \
--trust-for-lookup 111111111111 \
aws://222222222222/us-east-2
These command lines explained:
- npx: run the following command from the current NPM install, without requiring a global install.
- --profile: the AWS CLI profile holding administrator credentials for the account being bootstrapped.
- --cloudformation-execution-policies: the managed policies that CloudFormation will use when deploying into this environment.
- --trust: the account (here, the pipeline account 111111111111) that is allowed to deploy into this environment.
- aws://222222222222/us-east-2: the (account, region) environment being bootstrapped.
Be aware that anyone who has access to the trusted Accounts <strong>effectively has all
permissions conferred by the configured CloudFormation execution policies</strong>,
allowing them to do things like read arbitrary S3 buckets and create arbitrary
infrastructure in the bootstrapped account. Restrict the list of <code>--trust</code>ed Accounts,
or restrict the policies configured by <code>--cloudformation-execution-policies</code>.
Security tip: we recommend that you use administrative credentials to an account only to bootstrap it and provision the initial pipeline. Otherwise, access to administrative credentials should be dropped as soon as possible.
On the use of AdministratorAccess: The use of the AdministratorAccess policy ensures that your pipeline can deploy every type of AWS resource to your account. Make sure you trust all the code and dependencies that make up your CDK app. Check with the appropriate department within your organization to decide on the proper policy to use.
If your policy includes permissions to create or attach permissions to a role, developers can escalate their privileges by granting themselves more permissive permissions. We therefore recommend implementing a permissions boundary in the CDK execution role. To do this, bootstrap with the --template option and a customized template that contains a permissions boundary.
Migrating from old bootstrap stack
The bootstrap stack is a CloudFormation stack in your account named CDKToolkit that provisions a set of resources required for the CDK to deploy into that environment.
The "new" bootstrap stack (obtained by running cdk bootstrap with CDK_NEW_BOOTSTRAP=1) is slightly more elaborate than the "old" stack. It contains:
- An S3 bucket for file assets and an ECR repository for Docker image assets.
- A set of IAM roles with well-known names, used for deploying, publishing assets, and performing context lookups.
- An SSM parameter recording the version of the bootstrap stack.
It is possible and safe to migrate from the old bootstrap stack to the new bootstrap stack. This will create a new S3 file asset bucket in your account and orphan the old bucket. You should manually delete the orphaned bucket after you are sure you have redeployed all CDK applications and there are no more references to the old asset bucket.
Considerations around Running at Scale
If you are planning to run pipelines for more than a hundred repos deploying across multiple regions, then you will want to consider reusing both artifacts buckets and cross-region replication buckets.
In a situation like this, you will want to have a separate CDK app / dedicated repo which creates and manages the buckets shared by the pipelines of all your other apps. Note that this app must NOT itself use the shared buckets, because of chicken & egg issues.
The following code assumes you have created and are managing your buckets in the aforementioned separate cdk repo and are just importing them for use in one of your (many) pipelines.
string sharedXRegionUsWest1BucketArn;
string sharedXRegionUsWest1KeyArn;
string sharedXRegionUsWest2BucketArn;
string sharedXRegionUsWest2KeyArn;
var usWest1Bucket = Bucket.FromBucketAttributes(scope, "UsWest1Bucket", new BucketAttributes {
BucketArn = sharedXRegionUsWest1BucketArn,
EncryptionKey = Key.FromKeyArn(scope, "UsWest1BucketKeyArn", sharedXRegionUsWest1KeyArn)
});
var usWest2Bucket = Bucket.FromBucketAttributes(scope, "UsWest2Bucket", new BucketAttributes {
BucketArn = sharedXRegionUsWest2BucketArn,
EncryptionKey = Key.FromKeyArn(scope, "UsWest2BucketKeyArn", sharedXRegionUsWest2KeyArn)
});
IDictionary<string, IBucket> crossRegionReplicationBuckets = new Dictionary<string, IBucket> {
{ "us-west-1", usWest1Bucket },
{ "us-west-2", usWest2Bucket }
};
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
Synth = new ShellStep("Synth", new ShellStepProps {
Input = CodePipelineSource.Connection("my-org/my-app", "main", new ConnectionSourceOptions {
ConnectionArn = "arn:aws:codestar-connections:us-east-1:222222222222:connection/7d2469ff-514a-4e4f-9003-5ca4a43cdc41"
}),
Commands = new [] { "npm ci", "npm run build", "npx cdk synth" }
}), // Use shared buckets.
CrossRegionReplicationBuckets = crossRegionReplicationBuckets
});
Context Lookups
You might be using CDK constructs that need to look up runtime context, which is information from the target AWS Account and Region that the CDK needs to synthesize CloudFormation templates appropriate for that environment. Examples of this kind of context lookup are the number of Availability Zones available to you, a Route53 Hosted Zone ID, or the ID of an AMI in a given region. This information is automatically looked up when you run cdk synth.
By default, a cdk synth performed in a pipeline will not have permissions to perform these lookups, and the lookups will fail. This is by design.
Our recommended way of using lookups is by running cdk synth on the developer workstation and checking in the cdk.context.json file, which contains the results of the context lookups. This makes sure your synthesized infrastructure is consistent and repeatable. If you do not commit cdk.context.json, the results of the lookups may suddenly be different in unexpected ways, and even produce results that cannot be deployed or will cause data loss. To give an account permissions to perform lookups against an environment, without being able to deploy to it and make changes, run cdk bootstrap --trust-for-lookup=<account>.
If you want to use lookups directly from the pipeline, you either need to accept the risk of nondeterminism, or make sure you save and load the cdk.context.json file somewhere between synth runs. Finally, you should give the synth CodeBuild execution role permissions to assume the bootstrapped lookup roles. As an example, doing so would look like this:
new CodePipeline(this, "Pipeline", new CodePipelineProps {
Synth = new CodeBuildStep("Synth", new CodeBuildStepProps {
Input = CodePipelineSource.Connection("my-org/my-app", "main", new ConnectionSourceOptions {
ConnectionArn = "arn:aws:codestar-connections:us-east-1:222222222222:connection/7d2469ff-514a-4e4f-9003-5ca4a43cdc41"
}),
Commands = new [] { "...", "npm ci", "npm run build", "npx cdk synth", "..." },
RolePolicyStatements = new [] {
new PolicyStatement(new PolicyStatementProps {
Actions = new [] { "sts:AssumeRole" },
Resources = new [] { "*" },
Conditions = new Dictionary<string, object> {
{ "StringEquals", new Dictionary<string, string> {
{ "iam:ResourceTag/aws-cdk:bootstrap-role", "lookup" }
} }
}
}) }
})
});
The above example requires that the target environments have all been bootstrapped with bootstrap stack version 8, released with CDK CLI 1.114.0.
Security Considerations
It's important to stay safe while employing Continuous Delivery. The CDK Pipelines library comes with secure defaults to the best of our ability, but by its very nature the library cannot take care of everything.
We therefore expect you to mind the following:
Confirm permissions broadening
To keep tabs on the security impact of changes going out through your pipeline, you can insert a security check before any stage deployment. This security check will check if the upcoming deployment would add any new IAM permissions or security group rules, and if so pause the pipeline and require you to confirm the changes.
The security check will appear as two distinct actions in your pipeline: first a CodeBuild project that runs cdk diff on the stage that's about to be deployed, followed by a Manual Approval action that pauses the pipeline. If no new IAM permissions or security group rules would be added by the deployment, the manual approval step is automatically satisfied. The pipeline will look like this:
Pipeline
├── ...
├── MyApplicationStage
│ ├── MyApplicationSecurityCheck // Security Diff Action
│ ├── MyApplicationManualApproval // Manual Approval Action
│ ├── Stack.Prepare
│ └── Stack.Deploy
└── ...
You can insert the security check by using a ConfirmPermissionsBroadening
step:
CodePipeline pipeline;
var stage = new MyApplicationStage(this, "MyApplication");
pipeline.AddStage(stage, new AddStageOpts {
Pre = new [] {
new ConfirmPermissionsBroadening("Check", new PermissionsBroadeningCheckProps { Stage = stage }) }
});
To get notified when there is a change that needs your manual approval, create an SNS Topic, subscribe your own email address, and pass it in as the notificationTopic property:
CodePipeline pipeline;
var topic = new Topic(this, "SecurityChangesTopic");
topic.AddSubscription(new EmailSubscription("test@email.com"));
var stage = new MyApplicationStage(this, "MyApplication");
pipeline.AddStage(stage, new AddStageOpts {
Pre = new [] {
new ConfirmPermissionsBroadening("Check", new PermissionsBroadeningCheckProps {
Stage = stage,
NotificationTopic = topic
}) }
});
Note: Manual approval notifications only apply when an application has the security check enabled.
Using a different deployment engine
CDK Pipelines supports multiple deployment engines, but this module vends a construct for only one such engine: AWS CodePipeline. It is also possible to use CDK Pipelines to build pipelines backed by other deployment engines.
Here is a list of CDK libraries that integrate CDK Pipelines with alternative deployment engines:
- GitHub Workflows: cdk-pipelines-github
Troubleshooting
Here are some common errors you may encounter while using this library.
Pipeline: Internal Failure
If you see the following error during deployment of your pipeline:
CREATE_FAILED | AWS::CodePipeline::Pipeline | Pipeline/Pipeline
Internal Failure
There's something wrong with your GitHub access token. It might be missing, or not have the right permissions to access the repository you're trying to access.
Key: Policy contains a statement with one or more invalid principals
If you see the following error during deployment of your pipeline:
CREATE_FAILED | AWS::KMS::Key | Pipeline/Pipeline/ArtifactsBucketEncryptionKey
Policy contains a statement with one or more invalid principals.
One of the target (account, region) environments has not been bootstrapped with the new bootstrap stack. Check your target environments and make sure they are all bootstrapped.
Message: no matching base directory path found for cdk.out
If you see this error during the Synth step, it means that CodeBuild
is expecting to find a cdk.out
directory in the root of your CodeBuild project,
but the directory wasn't there. There are two common causes for this:
is in ROLLBACK_COMPLETE state and can not be updated
If you see the following error during execution of your pipeline:
Stack ... is in ROLLBACK_COMPLETE state and can not be updated. (Service:
AmazonCloudFormation; Status Code: 400; Error Code: ValidationError; Request
ID: ...)
The stack failed its previous deployment, and is in a non-retryable state. Go into the CloudFormation console, delete the stack, and retry the deployment.
Cannot find module 'xxxx' or its corresponding type declarations
You may see this if you are using TypeScript or other NPM-based languages, with NPM 7 on your workstation (where you generate package-lock.json) and NPM 6 on the CodeBuild image used for synthesizing.
It looks like NPM 7 writes less information to package-lock.json, so NPM 6 reading that same file no longer installs all required packages. Make sure you are using the same NPM version everywhere: either downgrade your workstation's version or upgrade the CodeBuild version.
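One way to align the versions, sketched below, is to install a matching npm version as the first synth command. The "npm@8" version is a placeholder; use whatever major version your workstation runs:

```csharp
// Sketch: align CodeBuild's npm with the workstation's npm before `npm ci`.
// "npm@8" is illustrative; match your local `npm --version`.
var synth = new ShellStep("Synth", new ShellStepProps {
    Input = CodePipelineSource.Connection("my-org/my-app", "main", new ConnectionSourceOptions {
        ConnectionArn = "arn:aws:codestar-connections:us-east-1:222222222222:connection/7d2469ff-514a-4e4f-9003-5ca4a43cdc41"
    }),
    Commands = new [] { "npm install -g npm@8", "npm ci", "npm run build", "npx cdk synth" }
});
```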
Cannot find module '.../check-node-version.js' (MODULE_NOT_FOUND)
The above error may be produced by npx when executing the CDK CLI, or any project that uses the AWS SDK for JavaScript, without the target application having been installed yet. For example, it can be triggered by npx cdk synth if aws-cdk is not in your package.json.
Work around this by either installing the target application using NPM before running npx, or by setting the environment variable NPM_CONFIG_UNSAFE_PERM=true.
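If you go the environment variable route, a sketch of setting it on the synth step via the env property (the source placeholder follows the document's convention):

```csharp
// Sketch: pass NPM_CONFIG_UNSAFE_PERM to the synth shell via the `env` prop.
IFileSetProducer source; // the repository source
var synth = new ShellStep("Synth", new ShellStepProps {
    Input = source,
    Env = new Dictionary<string, string> { { "NPM_CONFIG_UNSAFE_PERM", "true" } },
    Commands = new [] { "npm ci", "npm run build", "npx cdk synth" }
});
```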
Cannot connect to the Docker daemon at unix:///var/run/docker.sock
If, in the 'Synth' action (inside the 'Build' stage) of your pipeline, you get an error like this:
stderr: docker: Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?.
See 'docker run --help'.
It means that the AWS CodeBuild project for 'Synth' is not configured to run in privileged mode, which prevents Docker builds from happening. This typically happens if you use a CDK construct that bundles assets using tools run via Docker, like aws-lambda-nodejs, aws-lambda-python, aws-lambda-go, and others.
Make sure you set the privileged property to true in the synth definition:
var sourceArtifact = new Artifact();
var cloudAssemblyArtifact = new Artifact();
var pipeline = new CdkPipeline(this, "MyPipeline", new CdkPipelineProps {
CloudAssemblyArtifact = cloudAssemblyArtifact,
SynthAction = SimpleSynthAction.StandardNpmSynth(new StandardNpmSynthOptions {
SourceArtifact = sourceArtifact,
CloudAssemblyArtifact = cloudAssemblyArtifact,
Environment = new BuildEnvironment {
Privileged = true
}
})
});
After turning on privileged: true, you will need to do a one-time manual cdk deploy of your pipeline to get it going again (with a broken 'synth', the pipeline will not be able to self-update to the right state).
Not authorized to perform sts:AssumeRole on arn:aws:iam:::role/-lookup-role-*
You may get an error like the following in the Synth step:
Could not assume role in target account using current credentials (which are for account 111111111111). User:
arn:aws:sts::111111111111:assumed-role/PipelineStack-PipelineBuildSynthCdkBuildProje-..../AWSCodeBuild-....
is not authorized to perform: sts:AssumeRole on resource:
arn:aws:iam::222222222222:role/cdk-hnb659fds-lookup-role-222222222222-us-east-1.
Please make sure that this role exists in the account. If it doesn't exist, (re)-bootstrap the environment with
the right '--trust', using the latest version of the CDK CLI.
This is a sign that the CLI is trying to do Context Lookups during the Synth step, which are failing because it cannot assume the right role. We recommend you don't rely on Context Lookups in the pipeline at all, and commit a file called cdk.context.json with the right lookup values in it to source control.
If you do want to do lookups in the pipeline, the cause is one of the following:
- The target environment has not been bootstrapped; or
- The target environment has been bootstrapped without the right --trust relationship; or
- The CodeBuild execution role does not have permissions to assume the lookup role.
See the section called Context Lookups for more information on using this feature.
IAM policies: Cannot exceed quota for PoliciesPerRole / Maximum policy size exceeded
This happens as a result of having a lot of targets in the Pipeline: the IAM policies that get generated enumerate all required roles and grow too large.
Make sure you are on version 2.26.0 or higher, and that your cdk.json contains the following:
{
"context": {
"aws-cdk-lib/aws-iam:minimizePolicies": true
}
}
S3 error: Access Denied
An "S3 Access Denied" error can have two causes:
Self-mutation step has been removed
Some constructs, such as EKS clusters, generate nested stacks. When CloudFormation tries to deploy those stacks, it may fail with this error:
S3 error: Access Denied For more information check http://docs.aws.amazon.com/AmazonS3/latest/API/ErrorResponses.html
This happens because the pipeline is not self-mutating and, as a consequence, the FileAssetX build projects get out-of-sync with the generated templates. To fix this, make sure the selfMutating property is set to true:
var cloudAssemblyArtifact = new Artifact();
var pipeline = new CdkPipeline(this, "MyPipeline", new CdkPipelineProps {
SelfMutating = true,
CloudAssemblyArtifact = cloudAssemblyArtifact
});
Bootstrap roles have been renamed or recreated
While attempting to deploy an application stage, the "Prepare" or "Deploy" stage may fail with a cryptic error like:
Action execution failed Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: 0123456ABCDEFGH; S3 Extended Request ID: 3hWcrVkhFGxfiMb/rTJO0Bk7Qn95x5ll4gyHiFsX6Pmk/NT+uX9+Z1moEcfkL7H3cjH7sWZfeD0=; Proxy: null)
This generally indicates that the roles necessary to deploy have been deleted (or deleted and re-created);
for example, if the bootstrap stack has been deleted and re-created, this scenario will happen. Under the hood,
the resources that rely on these roles (e.g., cdk-$qualifier-deploy-role-$account-$region
) point to different
canonical IDs than the recreated versions of these roles, which causes the errors. There are no simple solutions
to this issue, and for that reason we strongly recommend that bootstrap stacks not be deleted and re-created
once created.
The most automated way to solve the issue is to introduce a secondary bootstrap stack. By changing the qualifier that the pipeline stack looks for, a change will be detected and the impacted policies and resources will be updated. A hypothetical recovery workflow would look something like this:
$ env CDK_NEW_BOOTSTRAP=1 npx cdk bootstrap \
--qualifier randchars1234 \
--toolkit-stack-name CDKToolkitTemp \
aws://111111111111/us-east-1
new Stack(this, "MyStack", new StackProps {
// Update this qualifier to match the one used above.
Synthesizer = new DefaultStackSynthesizer(new DefaultStackSynthesizerProps {
Qualifier = "randchars1234"
})
});
Manual Alternative
Alternatively, the errors can be resolved by finding each impacted resource and policy, and correcting the policies
by replacing the canonical IDs (e.g., AROAYBRETNYCYV6ZF2R93
) with the appropriate ARNs. As an example, the KMS
encryption key policy for the artifacts bucket may have a statement that looks like the following:
{
"Effect" : "Allow",
"Principal" : {
// "AWS" : "AROAYBRETNYCYV6ZF2R93" // Indicates this issue; replace this value
"AWS": "arn:aws:iam::0123456789012:role/cdk-hnb659fds-deploy-role-0123456789012-eu-west-1", // Correct value
},
"Action" : [ "kms:Decrypt", "kms:DescribeKey" ],
"Resource" : "*"
}
Any resource or policy that references the qualifier (hnb659fds by default) will need to be updated.
This CDK CLI is not compatible with the CDK library used by your application
The CDK CLI version used in your pipeline is too old to read the Cloud Assembly produced by your CDK app.
Most likely this happens in the SelfMutate action: you are passing the cliVersion parameter to control the version of the CDK CLI, and you just updated the CDK framework version that your application uses. You either forgot to change the cliVersion parameter, or changed the cliVersion in the same commit in which you changed the framework version. Because a change to the pipeline settings needs a successful run of the SelfMutate step to be applied, the next iteration of the SelfMutate step still executes with the old CLI version, and that old CLI version is not able to read the cloud assembly produced by the new framework version.
Solution: change the cliVersion first; commit, push, and deploy; and only then change the framework version.
We recommend you avoid specifying the cliVersion parameter at all. By default the pipeline will use the latest CLI version, which supports all cloud assembly versions.
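If you do pin the version, a sketch of where the parameter lives (the version string and source placeholder are illustrative):

```csharp
// Sketch: pinning the CDK CLI version used by self-mutation and asset
// publishing. Generally prefer omitting CliVersion so the latest CLI is used.
IFileSetProducer source; // the repository source
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
    CliVersion = "2.100.0", // illustrative; must be updated before the framework version
    Synth = new ShellStep("Synth", new ShellStepProps {
        Input = source,
        Commands = new [] { "npm ci", "npm run build", "npx cdk synth" }
    })
});
```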
Using Drop-in Docker Replacements
By default, the AWS CDK will build and publish Docker image assets using the docker command. However, by specifying the CDK_DOCKER environment variable, you can override the command that will be used to build and publish your assets.
In CDK Pipelines, the drop-in replacement for the docker
command must be
included in the CodeBuild environment and configured for your pipeline.
Adding to the default CodeBuild image
You can add a drop-in Docker replacement command to the default CodeBuild
environment by adding install-phase commands that encode how to install
your tooling and by adding the CDK_DOCKER
environment variable to your
build environment.
IFileSetProducer source; // the repository source
string[] synthCommands; // Commands to synthesize your app
string[] installCommands; // Commands to install your toolchain
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
// Standard CodePipeline properties...
Synth = new ShellStep("Synth", new ShellStepProps {
Input = source,
Commands = synthCommands
}),
// Configure CodeBuild to use a drop-in Docker replacement.
CodeBuildDefaults = new CodeBuildOptions {
PartialBuildSpec = BuildSpec.FromObject(new Dictionary<string, object> {
{ "phases", new Dictionary<string, object> {
{ "install", new Dictionary<string, object> {
// Add the shell commands to install your drop-in Docker
// replacement to the CodeBuild environment.
{ "commands", installCommands }
} }
} }
}),
BuildEnvironment = new BuildEnvironment {
EnvironmentVariables = new Dictionary<string, BuildEnvironmentVariable> {
// Instruct the AWS CDK to use `drop-in-replacement` instead of
// `docker` when building / publishing docker images.
// e.g., `drop-in-replacement build . -f path/to/Dockerfile`
{ "CDK_DOCKER", new BuildEnvironmentVariable { Value = "drop-in-replacement" } }
}
}
}
});
Using a custom build image
If you're using a custom build image in CodeBuild, you can override the command the AWS CDK uses to build Docker images by providing CDK_DOCKER as an ENV in your Dockerfile, or by providing the environment variable in the pipeline as shown below.
IFileSetProducer source; // the repository source
string[] synthCommands; // Commands to synthesize your app
var pipeline = new CodePipeline(this, "Pipeline", new CodePipelineProps {
// Standard CodePipeline properties...
Synth = new ShellStep("Synth", new ShellStepProps {
Input = source,
Commands = synthCommands
}),
// Configure CodeBuild to use a drop-in Docker replacement.
CodeBuildDefaults = new CodeBuildOptions {
BuildEnvironment = new BuildEnvironment {
// Provide a custom build image containing your toolchain and the
// pre-installed replacement for the `docker` command.
BuildImage = LinuxBuildImage.FromDockerRegistry("your-docker-registry"),
EnvironmentVariables = new Dictionary<string, BuildEnvironmentVariable> {
// If you haven't provided an `ENV` in your Dockerfile that overrides
// `CDK_DOCKER`, then you must provide the name of the command that
// the AWS CDK should run instead of `docker` here.
{ "CDK_DOCKER", new BuildEnvironmentVariable { Value = "drop-in-replacement" } }
}
}
}
});
Known Issues
There are some usability issues that are caused by underlying technology, and cannot be remedied by CDK at this point. They are reproduced here for completeness.
Classes
AddStageOpts | Options to pass to addStage. |
ArtifactMap | Translate FileSets to CodePipeline Artifacts. |
AssetType | Type of the asset that is being published. |
CodeBuildOptions | Options for customizing a single CodeBuild project. |
CodeBuildStep | Run a script as a CodeBuild Project. |
CodeBuildStepProps | Construction props for a CodeBuildStep. |
CodeCommitSourceOptions | Configuration options for a CodeCommit source. |
CodePipeline | A CDK Pipeline that uses CodePipeline to deploy CDK apps. |
CodePipelineActionFactoryResult | The result of adding actions to the pipeline. |
CodePipelineFileSet | A FileSet created from a CodePipeline artifact. |
CodePipelineProps | Properties for a CodePipeline. |
CodePipelineSource | Factory for CodePipeline source steps. |
ConfirmPermissionsBroadening | Pause the pipeline if a deployment would add IAM permissions or Security Group rules. |
ConnectionSourceOptions | Configuration options for CodeStar source. |
DockerCredential | Represents credentials used to access a Docker registry. |
DockerCredentialUsage | Defines which stages of a pipeline require the specified credentials. |
EcrDockerCredentialOptions | Options for defining access for a Docker Credential composed of ECR repos. |
ECRSourceOptions | Options for ECR sources. |
ExternalDockerCredentialOptions | Options for defining credentials for a Docker Credential. |
FileSet | A set of files traveling through the deployment pipeline. |
FileSetLocation | Location of a FileSet consumed or produced by a ShellStep. |
GitHubSourceOptions | Options for GitHub sources. |
ManualApprovalStep | A manual approval step. |
ManualApprovalStepProps | Construction properties for a ManualApprovalStep. |
PermissionsBroadeningCheckProps | Properties for a ConfirmPermissionsBroadening. |
PipelineBase | A generic CDK Pipelines pipeline. |
PipelineBaseProps | Properties for a PipelineBase. |
ProduceActionOptions | Options for the produceAction method. |
S3SourceOptions | Options for S3 sources. |
ShellStep | Run shell script commands in the pipeline. |
ShellStepProps | Construction properties for a ShellStep. |
StackAsset | An asset used by a Stack. |
StackDeployment | Deployment of a single Stack. |
StackDeploymentProps | Properties for a StackDeployment. |
StackOutputReference | A Reference to a Stack Output. |
StackOutputsMap | Translate stack outputs to CodePipeline variable references. |
StackSteps | Instructions for additional steps that are run at stack level. |
StageDeployment | Deployment of a single Stage. |
StageDeploymentProps | Properties for a StageDeployment. |
Step | A generic Step which can be added to a Pipeline. |
Wave | Multiple stages that are deployed in parallel. |
WaveOptions | Options to pass to addWave. |
WaveProps | Construction properties for a Wave. |
Interfaces
IAddStageOpts | Options to pass to addStage. |
ICodeBuildOptions | Options for customizing a single CodeBuild project. |
ICodeBuildStepProps | Construction props for a CodeBuildStep. |
ICodeCommitSourceOptions | Configuration options for a CodeCommit source. |
ICodePipelineActionFactory | Factory for explicit CodePipeline Actions. |
ICodePipelineActionFactoryResult | The result of adding actions to the pipeline. |
ICodePipelineProps | Properties for a CodePipeline. |
IConnectionSourceOptions | Configuration options for CodeStar source. |
IEcrDockerCredentialOptions | Options for defining access for a Docker Credential composed of ECR repos. |
IECRSourceOptions | Options for ECR sources. |
IExternalDockerCredentialOptions | Options for defining credentials for a Docker Credential. |
IFileSetLocation | Location of a FileSet consumed or produced by a ShellStep. |
IFileSetProducer | Any class that produces, or is itself, a FileSet. |
IGitHubSourceOptions | Options for GitHub sources. |
IManualApprovalStepProps | Construction properties for a ManualApprovalStep. |
IPermissionsBroadeningCheckProps | Properties for a ConfirmPermissionsBroadening. |
IPipelineBaseProps | Properties for a PipelineBase. |
IProduceActionOptions | Options for the produceAction method. |
IS3SourceOptions | Options for S3 sources. |
IShellStepProps | Construction properties for a ShellStep. |
IStackAsset | An asset used by a Stack. |
IStackDeploymentProps | Properties for a StackDeployment. |
IStackSteps | Instructions for additional steps that are run at stack level. |
IStageDeploymentProps | Properties for a StageDeployment. |
IWaveOptions | Options to pass to addWave. |
IWaveProps | Construction properties for a Wave. |