Is it possible to wrap an existing CDK App in a pipeline, so that I have the option of creating a pipeline for the application but can also do a regular cdk deploy --all when I just want to deploy the app locally?
Our current main app looks something like this (over simplified, but just to give idea):
const app = describeApp()
const coreStack = new CoreStack(app, 'CoreStack')
const domainConfig = new DomainConfig(app, 'DomainConfig')
...
What I would then like to do, is add something at the bottom along the lines of:
if (process.env.CREATE_PIPELINE) {
const pipelineApp = new App();
new PipelineStack(pipelineApp, 'PipelineStack', app);
}
With a PipelineStack class that effectively adds the main app as a stage to a pipeline, eg:
export class PipelineStack extends Stack {
constructor(scope: Construct, id: string, app: App, props?: StackProps) {
super(scope, id, props);
const repo = codeCommit.Repository.fromRepositoryName(this, 'Repo', 'XXX')
const pipeline = new CodePipeline(this, 'Pipeline', {
pipelineName: 'Pipeline',
synth: new CodeBuildStep('SynthStep', {
input: CodePipelineSource.codeCommit(repo, 'YYY'),
installCommands: [
'npm install -g yarn',
'cd app',
'yarn install',
'yarn global add aws-cdk'
],
commands: [
'yarn build',
'cdk synth'
]
}
)
});
pipeline.addStage(app)
}
}
This currently complains about my stage not having a stageName, but if I add the hack:
// @ts-ignore
app.stageName = 'DeployApp'
I then get errors around Error: Pipeline stack which uses cross-environment actions must have an explicitly set region.
I feel like there must be a more straightforward way of doing this that doesn't require rewriting my main app class to ONLY allow deployment via this new pipeline.
Yes. The important point here is that "local" and "pipeline" deploys have different construct hierarchies:
local: App > (CoreStack, DomainConfig)
pipeline: App > PipelineStack > Stage > (CoreStack, DomainConfig)
Because the stacks' parent is different in each case, it's useful to wrap your app's stacks in a reusable construct. The MyService example in the docs also uses this pattern.
// AppStacks.ts
export class AppStacks extends Construct {
constructor(scope: cdk.App | cdk.Stage, id: string, props: AppStacksProps) {
super(scope, id);
const coreStack = new CoreStack(this, 'CoreStack', props)
const domainConfig = new DomainConfig(this, 'DomainConfig', props)
}
}
For *local* (cdk deploy) deploys, your stacks, wrapped in AppStacks, are children of the App:
// app.ts
const app = new App();
new AppStacks(app, 'Stacks', props);
For *pipeline* deploys, your stacks are children of a Stage, and the Stage is added to the Pipeline.
// DeployStage.ts
export class DeployStage extends cdk.Stage {
constructor(scope: Construct, id: string, props: DeployStageProps) {
super(scope, id, { ...props, env: props.config.env });
new AppStacks(this, 'Stacks', props);
}
}
// app-pipeline.ts
const app = new App();
new PipelineStack(app, 'Pipeline', props); // add the stage: pipeline.addStage(new DeployStage...)
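For completeness, the PipelineStack referenced in that comment might look something like this (a minimal sketch, not from the original; the account/region values, repository name, and branch are placeholders, and the usual aws-cdk-lib imports are assumed):
// PipelineStack.ts (sketch; identifiers and env values are placeholders)
export class PipelineStack extends Stack {
  constructor(scope: Construct, id: string, props: DeployStageProps) {
    // An explicit env avoids the "cross-environment actions" error from the question.
    super(scope, id, { env: { account: '123456789012', region: 'us-east-1' } });
    const repo = codecommit.Repository.fromRepositoryName(this, 'Repo', 'my-repo');
    const pipeline = new CodePipeline(this, 'Pipeline', {
      synth: new ShellStep('Synth', {
        input: CodePipelineSource.codeCommit(repo, 'main'),
        commands: ['yarn install', 'yarn build', 'npx cdk synth'],
      }),
    });
    // The whole app deploys as a single pipeline stage.
    pipeline.addStage(new DeployStage(this, 'DeployApp', props));
  }
}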
I prefer to put my pipeline app setup in a separate file. But you could also have one "app" file with conditional logic, as in the OP.
I'm integrating the Squish automation tool with a Jenkins pipeline. Everything went smoothly until now: I need to send an email report after the job is done. I have a Groovy file as the pre-send script, but when this script runs, it throws this exception:
java.lang.NullPointerException: Cannot invoke method getRootDir() on null object
I figured out that the "build" object in my Groovy script is null, but I'm not sure why. Note that if I use the built-in Squish plugin and Editable Email on Jenkins, everything works fine. The problem only happens since I moved to a Pipeline.
### - This is my Groovy script:
List getJenkinsTestResultFiles() {
File squishResultsPath = new File( build.getRootDir(), "squishResults" )
if ( !squishResultsPath.exists() || !squishResultsPath.isDirectory() ) {
throw new GroovyRuntimeException( "Squish results path does not exist at: " + squishResultsPath.getAbsolutePath() )
}
File summaryFile = new File( squishResultsPath, "summary.xml" )
if ( !summaryFile.exists() || !summaryFile.isFile() ) {
throw new GroovyRuntimeException( "Squish summary file does not exist at: " + summaryFile.getAbsolutePath() )
}
List resultFiles = []
def summaries = new XmlSlurper().parse( summaryFile )
summaries.summary.each {
resultFiles.push( new File( squishResultsPath, it.xmlFileName.text() ) )
}
return resultFiles
}
### - This is my Pipeline script:
node('Slave_10.133.88.151') {
stage('Squish Test') {
step([$class: 'SquishBuilder',
abortBuildOnError: false,
extraOptions: '',
host: '',
port: '',
resultFolder: "${WORKSPACE}\\Squish_Report",
skipTestCases: false,
snoozeFactor: '1',
squishConfig: 'Default',
testCase: '',
testSuite: "${WORKSPACE}\\${TEST_SUITE}"])
}
stage('Send Email') {
emailext body: 'Test',
postsendScript: '${SCRIPT, template="SquishSummary.groovy"}',
subject: 'Pipeline',
to: 'hoang@local.com'
}
}
The build object is a hudson.model.Build object, and since you are calling it from a shared script you'll have to import the Build class in your Groovy script:
import hudson.model.Build
At the top of your shared library.
If you have already imported the object then the issue could be that you haven't initialized it inside of your shared library.
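If it helps to diagnose, you can guard the lookup explicitly; a minimal sketch (whether the build variable is bound at all depends on how email-ext evaluates the template in a Pipeline, so treat this as a diagnostic aid, not a fix):
// Fail fast with a clear message when `build` is not bound in this script's context
if (!binding.hasVariable('build') || build == null) {
    throw new IllegalStateException("'build' is not bound; the pre-send script is likely running in a Pipeline context")
}
File squishResultsPath = new File(build.getRootDir(), 'squishResults')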
I'm hoping to find a way to remove duplicated stages in a Declarative Jenkinsfile by loading environment variables based on the branch.
Currently I have something like:
#Library("MySharedLibrary#v1") _
String tagBaseDev = "repo.org/myspace/image:dev"
String tagBaseTest = "repo.org/myspace/image:test"
String tagBaseProd = "repo.org/myspace/image:prod"
pipeline {
agent none
stages {
// Always Run This
stage ('Maven Build and Unit Tests') {
agent {label 'docker-slave'}
steps {
sharedLibraryBuild mavenGoals:'clean package', additionalProps:['ci.env':'']
stash 'artifacts'
}
}
// Dev Only
stage ('Build Dev Docker Image and Push') {
when {
branch 'dev'
}
agent {label 'docker-slave'}
steps {
unstash 'artifacts'
sharedLibraryDockerImageBuildPush tag:"$tagBaseDev"
}
}
// Test Only
stage ('Build Test Docker Image and Push') {
when {
branch 'test'
}
agent {label 'docker-slave'}
steps {
unstash 'artifacts'
sharedLibraryDockerImageBuildPush tag:"$tagBaseTest"
}
}
// Prod Only
stage ('Build Prod Docker Image and Push') {
when {
branch 'prod'
}
agent {label 'docker-slave'}
steps {
unstash 'artifacts'
sharedLibraryDockerImageBuildPush tag:"$tagBaseProd"
}
}
}
}
I want to be able to reduce that into one stage block and dynamically load in the needed $tagBaseXXX based on branch. This is just an example but I'm planning to have four or five variables that will have different values for each environment.
My thought was to create EnvDev, EnvTest, and EnvProd maps with corresponding values, and then create an EnvMap that correlates each branch name to its environment map. For instance:
def EnvDev = [
url: "dev.com",
tag: "dev",
var: "Dev Var"
]
def EnvProd = [
url: "prod.com",
tag: "prod",
var: "prod Var"
]
def EnvMap = [
dev: EnvDev,
prod: EnvProd
]
I then try to create a Shared Library call that looks something like this:
def call(String branch, Map envMapping) {
Map use_me = envMapping.get(branch)
String url = use_me.get("url")
echo ("${url}")
}
The idea is to pass the Map, pull the corresponding environment map based on the branch, and then use the variables as needed.
So I have something like this:
#Library("MySharedLibrary#v1") _
def EnvDev = [
url: "dev.com",
tag: "dev",
var: "Dev Var"
]
def EnvProd = [
url: "prod.com",
tag: "prod",
var: "prod Var"
]
def EnvMap = [
dev: EnvDev,
prod: EnvProd
]
pipeline {
agent {label 'docker-slave'}
stages {
stage ('Test Env Vars') {
steps {
echo "$env.GIT_BRANCH"
sharedLibrarySetupEnv branch: "$env.GIT_BRANCH", evnMapping: EnvMap
}
}
}
}
But I get the following error:
hudson.remoting.ProxyException: groovy.lang.MissingMethodException: No signature of method: setupEnv.call() is applicable for argument types: (java.util.LinkedHashMap) values: [[branch:dev, env_mapping:[dev:[url:dev.com, tag:dev, var:Dev Var], ...]]]
Possible solutions: call(java.lang.String, java.util.Map), wait(), any(), wait(long), main([Ljava.lang.String;), each(groovy.lang.Closure)
Is there an easier way to accomplish what I'm trying to do?
This is my first time trying to write a Shared Library function so I'm guessing it may just be some Groovy syntax/concept I'm not familiar with.
Thanks!
Your function signature is def call(String branch, Map envMapping), but you are calling it with named arguments (branch: xxx, envMapping: xxx), which Groovy collects into a single Map.
Change to sharedLibrarySetupEnv branch: "$env.GIT_BRANCH", envMapping: EnvMap
The problem was with how I was trying to invoke the Shared Library function. I thought I was able to reference the parameter names, which led to the Jenkinsfile/pipeline passing a single LinkedHashMap to the shared library instead of two separate variables.
There are two solutions to this:
1. Have the Shared Library call method take in a Map<String, Object> parms and, within the call, reference the variables with parms.varname.
Shared Library:
def call(Map<String, Object> parms) {
echo "${parms.branch}"
Map use_this_map = parms.envMapping.get(parms.branch)
}
Jenkinsfile:
setupEnv branch: "$env.GIT_BRANCH", envMapping: EnvMap
2. Don't pass the variable names in the Jenkinsfile and have the Shared Library call method take in the corresponding variables.
Shared Library:
def call(String branch, Map<String, Map> envMapping) {
echo "${branch}"
Map use_this_map = envMapping.get(branch)
}
Jenkinsfile:
setupEnv env.GIT_BRANCH, EnvMap
You can use the BRANCH_NAME variable and put it in a condition like below:
if (env.BRANCH_NAME == 'master')
{
//set all the environment variable you need
} else {
//variable required if the condition doesn't match
}
You can use REGEX in the condition.
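For illustration, a minimal sketch of that idea (the branch names, URLs, and tags are placeholder values):
// Pick an environment map based on the branch; ==~ does a full REGEX match
def envConfig
if (env.BRANCH_NAME == 'master') {
    envConfig = [url: 'prod.com', tag: 'prod']
} else if (env.BRANCH_NAME ==~ /release\/.+/) {
    envConfig = [url: 'test.com', tag: 'test']
} else {
    envConfig = [url: 'dev.com', tag: 'dev']
}
echo "Using ${envConfig.url} with tag ${envConfig.tag}"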
I'd like to use a withCredentials() block in a shared-variable ("vars/") script rather than directly in the Jenkins pipeline because this is a lower-level semantic of a particular library, and also may or may not be required depending on the situation. However, withCredentials (or, at least, that signature of it) doesn't appear to be in scope.
script:
def credentials = [
[$class: 'UsernamePasswordMultiBinding', credentialsId: '6a55c310-aaf9-4822-bf41-5500cd82af4e', passwordVariable: 'GERRIT_PASSWORD', usernameVariable: 'GERRIT_USERNAME'],
[$class: 'StringBinding', credentialsId: 'SVC_SWREGISTRY_PASSWORD', variable: 'SVC_SWREGISTRY_PASSWORD']
]
withCredentials(credentials) {
// ...
}
Console:
hudson.remoting.ProxyException: groovy.lang.MissingMethodException: No signature of method: BuildagentInstallAndRun.withCredentials() is applicable for argument types: (java.util.ArrayList, org.jenkinsci.plugins.workflow.cps.CpsClosure2) values: [[[$class:UsernamePasswordMultiBinding, credentialsId:6a55c310-aaf9-4822-bf41-5500cd82af4e, ...], ...], ...]
Has anyone had any success with this?
I'm using a shared library rather than a shared variable, but I guess it is a similar situation.
I'm not using the $class parameter; instead I'm directly calling one of the functions suggested by the pipeline snippet generator. You can have a list here. In the example below, I use the usernameColonPassword binding.
In the pipeline, I instantiate the Utilities class and pass this to the constructor. Then, in the library, I use the steps object to access the pipeline steps (such as withCredentials or usernameColonPassword).
class Utilities implements Serializable {
def steps
Utilities(steps) {
this.steps = steps
}
def doArchiveToNexus(String credentials, String artifact, String artifact_registry_path){
try {
this.steps.withCredentials([steps.usernameColonPassword(credentialsId: credentials, variable: 'JENKINS_USER')]) {
this.steps.sh "curl --user " + '${JENKINS_USER}' + " --upload-file ${artifact} ${artifact_registry_path}"
}
} catch (error){
this.steps.echo error.getMessage()
throw error
}
}
}
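For context, calling such a class from a Jenkinsfile might look like the following (a sketch; the library name, package, credentials id, and paths are hypothetical):
@Library('my-shared-library') _
import org.example.Utilities  // hypothetical package for the class above

node {
    // Pass the pipeline script itself so the class can invoke steps
    def utils = new Utilities(this)
    utils.doArchiveToNexus('nexus-cred-id', 'target/app.jar',
            'https://nexus.example.com/repository/releases/app.jar')
}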
You can try the following:
import jenkins.model.*
credentialsId = '6a55c310-aaf9-4822-bf41-5500cd82af4e'
def creds = com.cloudbees.plugins.credentials.CredentialsProvider.lookupCredentials(
com.cloudbees.plugins.credentials.common.StandardUsernameCredentials.class, Jenkins.instance, null, null ).find{
it.id == credentialsId}
println creds.username
println creds.password
But it is not secure: everything will be visible in plain text in the console log.
I was able to obtain credentials inside the shared library, with proper password masking, with code like this:
class Utilities implements Serializable {
def steps
def credentialsId
Utilities(steps, credentialsId) {
    this.steps = steps
    this.credentialsId = credentialsId
}
def execute() {
this.steps.withCredentials(
bindings: [
this.steps.usernameColonPassword(
credentialsId: this.credentialsId,
variable: "unameColonPwd")
]) {
this.steps.sh "echo {this.steps.env.unameColonPwd}"
}
}
}
pipeline {
agent any
stages {
stage("foo") {
steps {
script {
env.RELEASE_SCOPE = input message: 'User input required', ok: 'Release!',
parameters: [choice(name: 'RELEASE_SCOPE', choices: 'patch\nminor\nmajor',
description: 'What is the release scope?')]
}
echo "${env.RELEASE_SCOPE}"
}
}
}
}
In the above code, the choices are hardcoded (patch\nminor\nmajor); my requirement is to populate the choice values in the dropdown dynamically.
I get the values by calling an API - a list of artifact (.zip) file names from Artifactory.
Also, the above example requests input while the build runs, but I want a "Build with parameters" instead.
Please suggest/help with this.
Depending on how you get the data from the API, there will be different options. For example, let's imagine that you get the data as a List of Strings (let's call it releaseScope); in that case your code would be the following:
...
script {
    def releaseScopeChoices = ''
    releaseScope.each {
        releaseScopeChoices += it + '\n'
    }
    env.RELEASE_SCOPE = input message: 'User input required', ok: 'Release!',
            parameters: [choice(name: 'RELEASE_SCOPE', choices: releaseScopeChoices,
                    description: 'What is the release scope?')]
}
...
hope it will help.
This is a cutdown version of what we use. We separate stuff into shared libraries but I have consolidated a bit to make it easier.
Jenkinsfile looks something like this:
#!groovy
@Library('shared') _
def imageList = pipelineChoices.artifactoryArtifactSearchList(repoName, env.BRANCH_NAME)
imageList.add(0, 'build')
properties([
buildDiscarder(logRotator(numToKeepStr: '20')),
parameters([
choice(name: 'ARTIFACT_NAME', choices: imageList.join('\n'), description: '')
])
])
Shared library that looks at artifactory, its pretty simple.
Essentially it makes a GET request (providing auth creds on it), then filters/splits the result to whittle it down to the desired values, and returns the list to the Jenkinsfile.
import com.cloudbees.groovy.cps.NonCPS
import groovy.json.JsonSlurper
import java.util.regex.Pattern
import java.util.regex.Matcher
List artifactoryArtifactSearchList(String repoKey, String artifact_name, String artifact_archive, String branchName) {
// URL components
String baseUrl = "https://org.jfrog.io/org/api/search/artifact"
String url = baseUrl + "?name=${artifact_name}&repos=${repoKey}"
Object responseJson = getRequest(url)
String regexPattern = "(.+)${artifact_name}-(\\d+).(\\d+).(\\d+).${artifact_archive}\$"
Pattern regex = ~ regexPattern
List<String> outlist = responseJson.results.findAll({ it['uri'].matches(regex) })
List<String> artifactlist=[]
for (i in outlist) {
artifactlist.add(i['uri'].tokenize('/')[-1])
}
return artifactlist.reverse()
}
// Artifactory Get Request - Consume in other methods
@NonCPS
Object getRequest(url_string){
URL url = url_string.toURL()
// Open connection
URLConnection connection = url.openConnection()
connection.setRequestProperty ("Authorization", basicAuthString())
// Open input stream
InputStream inputStream = connection.getInputStream()
json_data = new groovy.json.JsonSlurper().parseText(inputStream.text)
// Close the stream
inputStream.close()
return json_data
}
// Artifactory basic auth string - Consume in other methods
@NonCPS
Object basicAuthString() {
// Retrieve password
String username = "artifactoryMachineUsername"
String credid = "artifactoryApiKey"
credentials_store = jenkins.model.Jenkins.instance.getExtensionList(
'com.cloudbees.plugins.credentials.SystemCredentialsProvider'
)
credentials_store[0].credentials.each { it ->
if (it instanceof org.jenkinsci.plugins.plaincredentials.StringCredentials) {
if (it.getId() == credid) {
apiKey = it.getSecret()
}
}
}
// Create authorization header format using Base64 encoding
String userpass = username + ":" + apiKey;
String basicAuth = "Basic " + javax.xml.bind.DatatypeConverter.printBase64Binary(userpass.getBytes());
return basicAuth
}
I could achieve it without any plugin:
With Jenkins 2.249.2, using a declarative pipeline,
the following pattern prompts the user with a dynamic dropdown menu
(for them to choose a branch).
(The surrounding withCredentials block is optional, required only if your script and Jenkins configuration do use credentials.)
node {
withCredentials([[$class: 'UsernamePasswordMultiBinding',
credentialsId: 'user-credential-in-gitlab',
usernameVariable: 'GIT_USERNAME',
passwordVariable: 'GITLAB_ACCESS_TOKEN']]) {
BRANCH_NAMES = sh (script: 'git ls-remote -h https://${GIT_USERNAME}:${GITLAB_ACCESS_TOKEN}@dns.name/gitlab/PROJS/PROJ.git | sed \'s/\\(.*\\)\\/\\(.*\\)/\\2/\' ', returnStdout:true).trim()
}
}
pipeline {
agent any
parameters {
choice(
name: 'BranchName',
choices: "${BRANCH_NAMES}",
description: 'to refresh the list, go to configure, disable "this build has parameters", launch build (without parameters)to reload the list and stop it, then launch it again (with parameters)'
)
}
stages {
stage("Run Tests") {
steps {
sh "echo SUCCESS on ${BranchName}"
}
}
}
}
The drawback is that one has to refresh the Jenkins configuration and use a blank run for the list to be refreshed by the script ...
Solution (not from me): this limitation can be made less annoying by using an additional parameter used specifically to refresh the values:
parameters {
booleanParam(name: 'REFRESH_BRANCHES', defaultValue: false, description: 'refresh BRANCH_NAMES branch list and launch no step')
}
then within a stage:
stage('a stage') {
when {
expression {
return ! params.REFRESH_BRANCHES.toBoolean()
}
}
...
}
This is my solution:
def envList
def dockerId
node {
envList = "defaultValue\n" + sh (script: 'kubectl get namespaces --no-headers -o custom-columns=":metadata.name"', returnStdout: true).trim()
}
pipeline {
agent any
parameters {
choice(choices: "${envList}", name: 'DEPLOYMENT_ENVIRONMENT', description: 'please choose the environment you want to deploy?')
booleanParam(name: 'SECURITY_SCAN',defaultValue: false, description: 'container vulnerability scan')
}
The example of Jenkinsfile below contains AWS CLI command to get the list of Docker images from AWS ECR dynamically, but it can be replaced with your own command. Active Choices Plug-in is required.
Note! You need to approve the script specified in the parameters after the first run in "Manage Jenkins" -> "In-process Script Approval", or open the job configuration and save it to approve it automatically (might require administrator permissions).
properties([
parameters([[
$class: 'ChoiceParameter',
choiceType: 'PT_SINGLE_SELECT',
name: 'image',
description: 'Docker image',
filterLength: 1,
filterable: false,
script: [
$class: 'GroovyScript',
fallbackScript: [classpath: [], sandbox: false, script: 'return ["none"]'],
script: [
classpath: [],
sandbox: false,
script: '''\
def repository = "frontend"
def aws_ecr_cmd = "aws ecr list-images" +
" --repository-name ${repository}" +
" --filter tagStatus=TAGGED" +
" --query imageIds[*].[imageTag]" +
" --region us-east-1 --output text"
def aws_ecr_out = aws_ecr_cmd.execute() | "sort -V".execute()
def images = aws_ecr_out.text.tokenize().reverse()
return images
'''.stripIndent()
]
]
]])
])
pipeline {
agent any
stages {
stage('First stage') {
steps {
sh 'echo "${image}"'
}
}
}
}
choiceArray = [ "patch" , "minor" , "major" ]
properties([
parameters([
choice(choices: choiceArray.join('\n'),
description: '',
name: 'SOME_CHOICE')
])
])
I have the following Jenkins DSL file:
if (params["BUILD_SNAPSHOT"] == "true") {
parallel(
{
build("company-main-build-snapshot")
},
{
build("1-company-worker-build-snaphsot", WORKER_NAME: "sharding-worker")
}
)
}
parallel (
{
build("company-deployment-info",
API_KEY: "aaaaa5dd4cd58b94215f9cddd4441c391b4ddde226ede98",
APP: "company-Staging-App")
},
{
build("company-salt-role-deploy",
ENV: "staging",
ROLE: "app")
},
{
build("company-deployment-info",
API_KEY: "aaaaa5dd4cd58b94215f9cddd4441c391b4ddde226ede98",
APP: "company-Staging-Shardwork")
},
{
build("company-salt-workers-deploy",
ENVIRONMENT: "staging",
WORKER_TYPE: "shardwork")
}
)
if (params["REST_TEST"] == "true") {
build("company_STAGING_python_rest_test")
}
My task is to convert/rewrite this workflow file content into a Jenkins pipeline Jenkinsfile.
I have some example files for reference, but I'm having a hard time understanding how I should even begin...
Can anyone please shed some light on this subject?
First, have a good look at the Jenkins pipeline documentation; it is a great start and provides a whole bunch of information, such as Build Parameters usage or parallel steps.
Here are a few more hints for you to explore:
Parameters
Just use the parameter name as a variable (build parameters arrive as Strings, so convert explicitly), such as:
if (BUILD_SNAPSHOT.toBoolean()) {
...
}
Call other jobs
You can also use the build step, such as:
build job: '1-company-worker-build-snaphsot', parameters: [string(name: 'WORKER_NAME', value: 'sharding-worker')]
Use functions
Instead of calling downstream jobs using build steps each time, you might want to consider using pipeline functions from another Groovy script, either from your current project or even from an external, checked out Groovy script.
As an example, you could replace your second job call from:
build("1-company-worker-build-snaphsot", WORKER_NAME: "sharding-worker")
to:
git 'http://urlToYourGit/projectContainingYourScript'
pipeline = load 'global-functions.groovy'
pipeline.buildSnapshot("sharding-worker")
...of course the init phase (Git checkout and pipeline loading) is only needed once before you can call all of your external script's functions.
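For illustration, global-functions.groovy could wrap the downstream jobs like this (a sketch; only the job names from the question are assumed):
// global-functions.groovy
def buildMainSnapshot() {
    build job: 'company-main-build-snapshot'
}

def buildWorkerSnapshot(String workerName) {
    build job: '1-company-worker-build-snaphsot',
          parameters: [string(name: 'WORKER_NAME', value: workerName)]
}

return this  // `load` returns this script object so its functions can be called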
In short
To sum it up a little bit, your code could be converted to something along these lines:
node {
git 'http://urlToYourGit/projectContainingYourScript'
pipeline = load 'global-functions.groovy'
if (BUILD_SNAPSHOT.toBoolean()) {
parallel (
phase1: { pipeline.buildMainSnapshot() },
phase2: { pipeline.buildWorkerSnapshot("sharding-worker") }
)
}
parallel (
phase1: { pipeline.phase1(params...) },
phase2: { pipeline.phase2(params...) },
phase3: { pipeline.phase3(params...) },
phase4: { pipeline.phase4(params...) }
)
if (REST_TEST.toBoolean()) {
pipeline.finalStep()
}
}