AWS CDK Conditional Stage Deployments - aws-cdk

I have a bit of a challenge here. I am using CDK Pipelines pretty extensively to do deployments and maintain 50 different deployments in an environment. I have things replicated across multiple accounts / regions using different pipelines. But I need to start adding some conditional statements into the deployments to either deploy or not deploy a stage in a pipeline, depending on environment readiness (think bootstrapping / setup vs. long-term running).
So, I am trying to find a way in my pipeline to add CfnCondition statements to a stage definition, so that the stage is not added to the pipeline until the environment is ready. I have a stage / script that does a ton of infrastructure setup, so I don't want the other stages running until that stuff is done. I found this article (https://loige.co/create-resources-conditionally-with-cdk/#using-cfncondition-with-cdk) but 1. I cannot find a low-level (L1) construct for a stage and 2. it does not seem to work in CDK v2 / errors on cfnOptions.
Anyone have an option for this so I can get the value from SSM / make conditional statements in my stage definitions? (By the way, if I do this within the project / a static local file it works fine; I need it from SSM / Parameter Store so that I can release the other pipelines once the setups are done.)
Thanks
Nick

This is quite simple to achieve. You can pass SSM parameters to the whole pipeline by specifying them in the CodeBuild environment (where the synth is done); see the documentation for details. In the pipeline definition itself you can refer to the variable like this:
if (process.env.MY_SSM_VAR === 'myFancyInput') {
  // only add the stage when the SSM-backed variable says the environment is ready
  pipeline.addStage(myConditionalStage);
}
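A minimal CDK v2 sketch of that setup could look like the following (the parameter name, connection ARN, and the MyAppStage class are placeholder assumptions, not anything from your project). CodeBuild resolves the Parameter Store value into an environment variable, and the synth that runs inside CodeBuild reads it to decide whether the stage is added:

import * as cdk from 'aws-cdk-lib';
import { pipelines } from 'aws-cdk-lib';
import * as codebuild from 'aws-cdk-lib/aws-codebuild';
import { MyAppStage } from './my-app-stage'; // hypothetical Stage subclass

const app = new cdk.App();
const stack = new cdk.Stack(app, 'PipelineStack');

const pipeline = new pipelines.CodePipeline(stack, 'Pipeline', {
  synth: new pipelines.CodeBuildStep('Synth', {
    input: pipelines.CodePipelineSource.connection('my-org/my-repo', 'main', {
      connectionArn: 'arn:aws:codestar-connections:region:account:connection/placeholder',
    }),
    commands: ['npm ci', 'npx cdk synth'],
    buildEnvironment: {
      environmentVariables: {
        // CodeBuild pulls the current value from Parameter Store on every build
        ENV_READY: {
          type: codebuild.BuildEnvironmentVariableType.PARAMETER_STORE,
          value: '/deploy/env-ready', // assumed parameter name
        },
      },
    },
  }),
});

// During the synth that runs inside CodeBuild, process.env.ENV_READY holds the
// Parameter Store value, so the stage only joins the pipeline once the
// environment has been bootstrapped and the parameter is flipped to 'true'.
if (process.env.ENV_READY === 'true') {
  pipeline.addStage(new MyAppStage(stack, 'Prod'));
}

app.synth();

Note that a synth run outside CodeBuild (e.g. locally) will not see the variable and will simply omit the stage, which is the behaviour you want: the stage appears once the parameter is set and the self-mutating pipeline re-synthesizes itself.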

As mentioned, Parameter Store is very useful for this: it works well for any concept that exists in every pipeline but whose value changes (e.g. the name of a Lambda, or the endpoint for an API).
However, note that Pipelines cannot do "if X then Y" sorts of deals. It is always going to be "Step 1, Step 2, Step 3...", never "Step 1, then if A, Step 2, else Step 2.5".
If you have things that are very conditional and depend on other actions, your best bet is to create a Step Function that handles all of that branching and call it as part of your pipeline.
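For illustration only, here is a rough CDK v2 sketch of wiring a Step Function into a pipeline; it uses the low-level codepipeline.Pipeline construct with an S3 source and a placeholder Pass state, all of which are assumptions rather than anything from your setup:

import { Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as codepipeline from 'aws-cdk-lib/aws-codepipeline';
import * as actions from 'aws-cdk-lib/aws-codepipeline-actions';
import * as s3 from 'aws-cdk-lib/aws-s3';
import * as sfn from 'aws-cdk-lib/aws-stepfunctions';

export class ConditionalWorkPipelineStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // The state machine owns the "if X then Y" branching that the pipeline
    // itself cannot express; the Pass state is a stand-in for the real logic.
    const conditionalDeploy = new sfn.StateMachine(this, 'ConditionalDeploy', {
      definitionBody: sfn.DefinitionBody.fromChainable(
        new sfn.Pass(this, 'ReplaceWithRealBranchingLogic'),
      ),
    });

    const sourceBucket = new s3.Bucket(this, 'SourceBucket', { versioned: true });
    const sourceOutput = new codepipeline.Artifact();

    new codepipeline.Pipeline(this, 'Pipeline', {
      stages: [
        {
          stageName: 'Source',
          actions: [
            new actions.S3SourceAction({
              actionName: 'Source',
              bucket: sourceBucket,
              bucketKey: 'source.zip',
              output: sourceOutput,
            }),
          ],
        },
        {
          stageName: 'ConditionalWork',
          // This stage just invokes the state machine and waits for it to finish.
          actions: [
            new actions.StepFunctionInvokeAction({
              actionName: 'InvokeConditionalDeploy',
              stateMachine: conditionalDeploy,
            }),
          ],
        },
      ],
    });
  }
}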

Related

Using the REST API to trigger a specific stage within a YAML pipeline

Is there a way to execute a specific stage within a running YAML pipeline which uses an environment with approvals?
I have an on-prem deploy stage and an on-prem destroy stage; both have manual approvals.
What I would like to do is run the on-prem destroy stage in past builds using the REST API.
What I have achieved so far is getting the 10 most recent builds in descending order for a specific source branch, let's call it feature/on-prem-enterprise. Then I do some parsing and find past builds that had a successful deployment but a failed, cancelled, or skipped destroy stage. Using these results from the timeline endpoint, I want to use the REST API to run/re-run the destroy stage in those builds.
We get into a situation where we have several deployments but nobody manually running the destroy stage, and because this pipeline is shared amongst all developers for dev builds, it's very difficult to find those older builds manually.
If it cannot be achieved, then another solution may be to compile this list of builds and send an email out, but I would prefer less manual intervention here.
Is there a way to execute a specific stage within a running YAML pipeline which uses an environment with approvals?
The answer is yes.
You could use the Runs - Run Pipeline REST API with the request body below, skipping the other stages so that only the stage you want is triggered:
POST https://dev.azure.com/{organization}/{project}/_apis/pipelines/{pipelineId}/runs?api-version=6.0-preview.1
Request Body:
{
  "stagesToSkip": ["Dev", "Test"]
}
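If you would rather drive this from a script than from Postman, a rough Node/TypeScript sketch could look like the one below; the organization, project, pipeline id, stage names, and the AZDO_PAT environment variable are all placeholders:

// Trigger a run of the pipeline, skipping every stage except the one you want.
const org = 'myorg';
const project = 'myproject';
const pipelineId = 42;
const pat = process.env.AZDO_PAT ?? '';

async function runWithSkippedStages(): Promise<void> {
  const url = `https://dev.azure.com/${org}/${project}/_apis/pipelines/${pipelineId}/runs?api-version=6.0-preview.1`;
  const response = await fetch(url, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // Azure DevOps PATs use basic auth with an empty user name.
      Authorization: `Basic ${Buffer.from(`:${pat}`).toString('base64')}`,
    },
    body: JSON.stringify({ stagesToSkip: ['Dev', 'Test'] }),
  });
  if (!response.ok) {
    throw new Error(`Run request failed: ${response.status} ${await response.text()}`);
  }
  console.log(await response.json());
}

runWithSkippedStages().catch(console.error);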
You can use the Stages - Update REST API method. This is part of the Build resource methods but works fine for YAML Pipelines as well.
PATCH https://dev.azure.com/{organization}/{project}/_apis/build/builds/{buildId}/stages/{stageRefName}?api-version=7.1-preview.1
It sounds like you're already getting the buildId programmatically. The stageRefName is the name of the stage as defined in your YAML. Your URI will look something like:
https://dev.azure.com/myorg/myproject/_apis/build/builds/1234/stages/DestroyStageName?api-version=7.1-preview.1
In the request body you'll need:
{
  "forceRetryAllJobs": false,
  "state": 1
}
Here state 1 means retry; forceRetryAllJobs may be unnecessary. There's an example implementation in PowerShell here.
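A similar sketch for scripting the retry itself (again, the organization, project, build id, stage name, and PAT are placeholders):

// Retry the destroy stage of a past build via the Stages - Update API.
const org = 'myorg';
const project = 'myproject';
const buildId = 1234;
const stageRefName = 'DestroyStageName';
const pat = process.env.AZDO_PAT ?? '';

async function retryStage(): Promise<void> {
  const url = `https://dev.azure.com/${org}/${project}/_apis/build/builds/${buildId}/stages/${stageRefName}?api-version=7.1-preview.1`;
  const response = await fetch(url, {
    method: 'PATCH',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Basic ${Buffer.from(`:${pat}`).toString('base64')}`,
    },
    // state 1 = retry, as noted above; forceRetryAllJobs may be unnecessary.
    body: JSON.stringify({ forceRetryAllJobs: false, state: 1 }),
  });
  if (!response.ok) {
    throw new Error(`Stage retry failed: ${response.status} ${await response.text()}`);
  }
}

retryStage().catch(console.error);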
If you're struggling to identify the appropriate API method to replicate something you do in the Azure DevOps GUI, opening your browser's developer tools and inspecting the requests in the Network tab can often help you identify the call that's being used.

How to reusably define an ordered list of build-steps for Jenkins and an arbitrary script?

Situation: I have a Jenkins pipeline job file with lots of (regularly changing) stages hard-coded in Groovy, and I need a way to locally reproduce what is being done on CI.
In order to let the developer locally do "what Jenkins would do", without having to manually keep a list of steps synchronized with the respective Jenkinsfile, I'm looking for a way to store the ordered list of stages in a form accessible to both Jenkins and a local script.
In other words - I want to be able to check out my project repository and run
make what-Jenkins-would-do
(make is only an example - I really don't want to use make for this)
So, given a set of scripts which contain what's being executed in each stage, I now just need the order of execution stored in a sensible way.
What I'd love to have is a way to let Jenkins read a list of pipeline steps from a Yaml/JSON file which then can be used in other scripts, too.
Here is what's going through my mind
I can't be the only one - there must be a tiny nice solution for this need
maybe I could use Groovy locally, but that would add another heavy dependency to my project, and the Groovy scripts contain lots of Jenkins- and node-specific stuff I don't need locally
Don't want to store information redundantly
Just executing a 'do it all' script in both Jenkins and locally is not an option of course - I need individual stages.
Jenkins / Groovy and pipeline jobs are a requirement - I can't change that
So what's the modern solution to this? Is there something like
node('main') {
    stage('checkout') {
        // make the source code available
    }
    stages_from_file("build-stages.yaml")
}
?

Azure DevOps: YAML Pipeline for independent Deployment of Single Tenant .NET MVC App for different clients

I need suggestions for creating a YAML pipeline for the independent deployment of a single-tenant .NET MVC app for different clients.
Application Type: .Net MVC Web API
Web Server: IIS 10
Database: MS SQL Server
Environment: Private Data Center
Number of Clients/tenant: 150+
Deployment: For each client/tenant, a separate IIS Web App is created. Also, a separate database is created for each client.
Expected Deployment: Manual mode (Not considering CD because CI and test suite are not available yet).
If you can guide me about the following points.
How should the pipeline be created so that I can use different configuration parameters per client/tenant (e.g. database name and connection string), but at the same time use a common deployment script for the generated release?
Should I create a single pipeline, or should there be multiple?
How should I use releases, stages, and jobs effectively for such a scenario?
If there are good articles about manual, independent deployment for each client, I would like to study them.
Generally, if you want to deploy to different environments, you can set up a stage for each environment in a pipeline. However, considering that you have 150+ different configurations, it would be torturous to set up 150+ stages in the pipeline.
If all the deployments have the same deployment steps (same scripts, same input parameters) but different values for those input parameters, you can try using a multi-job configuration (the matrix) in the pipeline.
This way, you do not need to set up a stage or a job for each configuration; you just need to set up one stage or job with all the common deployment steps. You do, however, need to enumerate all the configurations (150+) you require. When the pipeline runs, it will generate 150+ matrix jobs with the same deployment steps but different values for the input parameters.
[UPDATE]
Just curious, in this case of multi-job configuration, all the 150+ installations will be triggered in one go, right?
After the pipeline run is triggered, all the 150+ matrix jobs will be triggered and placed in the queue. However, normally not all 150+ jobs will start running in parallel at the same time; it depends on the maxParallel you set and how many available agents can be assigned to the run.
I can't select a way where deployment is started for, let's say, only 5 of the clients.
If you want that the deployment can be executed firstly for some clients and then for other clients, you can try using stages.
For example, in stage_1, execute the deployment job (multi-job configuration) for the first 5 clients. After stage_1, start stage_2 for the next several clients, then stage_3 for other clients, and so on.
You can use the dependsOn key to set the execution order of the stages, and use the condition key so that a stage only runs when the specified condition is met.
To view more details, you can see "Add stages, dependencies, & conditions".
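Purely as an illustration (the stage names, tenant variables, deploy command, and the queue-time variable deployBatch2 are all assumptions), combining the matrix with dependsOn and condition could look roughly like this in the pipeline YAML:

stages:
- stage: Batch_1
  jobs:
  - job: Deploy
    strategy:
      matrix:
        clientA: { tenantName: 'clientA', dbName: 'ClientA_Db' }
        clientB: { tenantName: 'clientB', dbName: 'ClientB_Db' }
        # ...one entry per tenant in this batch
      maxParallel: 5
    steps:
    - script: echo "deploying $(tenantName) against $(dbName)"   # replace with the real deployment script
      displayName: 'Deploy $(tenantName)'

- stage: Batch_2
  dependsOn: Batch_1
  # Only runs after Batch_1 succeeds and when the run was queued with deployBatch2 = true
  condition: and(succeeded(), eq(variables['deployBatch2'], 'true'))
  jobs:
  - job: Deploy
    strategy:
      matrix:
        clientC: { tenantName: 'clientC', dbName: 'ClientC_Db' }
      maxParallel: 5
    steps:
    - script: echo "deploying $(tenantName) against $(dbName)"   # replace with the real deployment script
      displayName: 'Deploy $(tenantName)'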

How to update a Jenkins Properties Global Environment Variable from within a pipeline stage in Jenkinsfile

I want to update a ".env" properties value so that the next execution has the new value.
loggingUtils.info("${env.testVar}")
env.testVar = "cat"
loggingUtils.info("${env.testVar}")
Currently what happens is that if I configure env.testVar to have a value of "dog" from within Jenkins, the print statements will be:
dog
cat
but the next time I execute I want it to be
cat
cat
However, it is always just
dog
cat
Is there a way to set the environment variable so that future builds will have the new value? I would prefer to do this without a plugin if possible.
Builds (can be thought of as "instances of executions") in Jenkins are independent of each other.
If you are trying to tie builds together by passing information across builds, I would encourage you to think about what you are really trying to do; you might not be doing Continuous Integration properly.
Each time you execute a build it starts from scratch. Continuous Integration always starts with what's in source control. Nothing derived should be committed to source control.
I would suggest environment configuration should normally be stored in config files in source control and applied as appropriate via parameters to a build (i.e. which environment do I want to deploy a given build to?).

Is it possible to create a pipeline sharing all variables, where some stages can be manually run?

I would like to create a Jenkins pipeline of a few stages, specifically a pipeline for uploading.
Examples:
Set version to upload -> set other variables -> upload artifacts A -> upload artifacts B -> upload artifacts C -> correct?
This is OK, but what if I want to upload only B? Do I have to create an input "Do you want to upload A/B/C?" and then solve it with if blocks?
Is there better way?
I used to use the Task plugin for this, but the problem is that it is too static and it doesn't share variables between tasks.
Jenkins Pipeline, whether intentionally or unintentionally, doesn't seem to be supportive of "skipping stages". I understand why they would do this - is a pipeline valid if an app takes a different path each time? Not sure.
We have several cases where we do exactly what you describe: pose an input question about what actions to perform, and then have if/else conditionals that perform those actions all within a single stage. We use it typically for "Which DEV environments do you want to deploy to? DEV1, DEV2, etc." since different developers want to use one of multiple dev environments. Our higher environments and deeper testing do not allow these conditionals.
