Problem
I want to set a parameter conditionally based on which branch triggered the pipeline. If the triggered branch was feature/automated-testing, I would like to set a parameter equal to "True". See the code below.
Part of my pipeline.yml file looks like this:
trigger:
  branches:
    include:
    - feature/automated-testing

...

# Global variables for the pipeline
variables:
- name: "triggerRepoName"
  value: "$(Build.SourceBranchName)"

stages:
# Common stage. Docker build, tag and push
- stage: BuildDockerImage
  displayName: "Build docker image"
  variables:
    ...
  jobs:
  - template: /templates/pipelines/my-prject.yml#templates
    parameters:
      ${{ if eq( variables.triggerRepoName, 'feature/automated-testing') }}:
        runTests: "True"
      ${{ if ne(variables.triggerRepoName, 'feature/automated-testing') }}:
        runTests: "False"
Question
When I push from branch feature/automated-testing and echo the variable runTests in the Dockerfile, it is blank. Is there something wrong with the syntax of my conditional statement?
I believe the error is in the way the variable is set conditionally, so I have chosen not to include the Dockerfile or the template .yml file it uses.
Change variables.triggerRepoName to variables['triggerRepoName']. That should solve your issue.
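Applied to the pipeline above, the parameters block becomes (only the index syntax changes; everything else stays as in the question):

jobs:
- template: /templates/pipelines/my-prject.yml#templates
  parameters:
    ${{ if eq( variables['triggerRepoName'], 'feature/automated-testing') }}:
      runTests: "True"
    ${{ if ne(variables['triggerRepoName'], 'feature/automated-testing') }}:
      runTests: "False"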
Related
Right now, if I want to pass an environment variable to a reusable workflow, I have to do something like this:
name: Reusable workflow

on:
  workflow_call:
    inputs:
      my_env_var:
        required: false
        type: string

env:
  my_env_var: ${{ inputs.my_env_var }}
However, for this I first need to define each environment variable I want to pass as an input. This works, but hard-coding the environment variables makes my reusable workflows less generic. Is there a way to pass envs without having to define them one by one? I was thinking of something like this:
name: Calling reusable workflow

on:
  workflow_dispatch:

jobs:
  push-image-dev:
    uses: ./.github/workflows/my-reusable-workflow.yml
    with:
      input1: ...
      input2: ...
    env:
      env1: ...
      env2: ...
However, I have been reading the documentation and I don't think that exists. Is there any other way of doing it, such as inheriting env variables or creating a single input that is a dictionary of variables, which is later parsed to set all the env vars in the reusable workflow?
I created this simple logic, which lets me pass any number of environment variables through a single input (a 1:N relationship between inputs and variables). The input expects a list of environment variables, one per line in env=value format, and a step inside the workflow later converts them into actual environment variables.
Calling the workflow:
name: Calling reusable workflow

on:
  workflow_dispatch:

jobs:
  my-job:
    uses: ./.github/workflows/my-reusable-workflow.yml
    with:
      env_vars: |
        env1=value1
        env2=value2
        env3=value3
Workflow definition:
name: My reusable workflow

on:
  workflow_call:
    inputs:
      env_vars:
        description: List of environment variables to set up, given in env=value format.
        required: false
        type: string

jobs:
  my-job:
    runs-on: ubuntu-latest
    steps:
      - name: Set environment variables
        if: ${{ inputs.env_vars }}
        run: |
          # Write each env=value line to $GITHUB_ENV so it becomes an
          # environment variable for all subsequent steps in this job.
          # (Relies on word splitting, so values must not contain spaces.)
          for i in "${{ inputs.env_vars }}"
          do
            printf "%s\n" $i >> "$GITHUB_ENV"
          done
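As a quick sanity check, a later step in the same job can read one of the variables set above (env1 is just the hypothetical name from the calling example):

      - name: Use the environment variables
        run: |
          # env1 was written to $GITHUB_ENV by the previous step
          echo "env1 is $env1"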
I have a yml file which calls a templated stage:
- template: pipeline-templates/iac/deploy-template.yml
  parameters:
    envName: dev
    varGroup: dev-variables-group-library
In the template I make use of the template parameters:
parameters:
  envName: ''
  varGroup: ''

stages:
- stage: Deploy_for_${{ parameters.envName }}
I want to use the envName variable from the variable group, instead of the template parameters. I have tried template code like this:
parameters:
  envName: ''
  varGroup: ''

stages:
- stage: Deploy_for_$( envName )
When I do this, the pipeline throws a seemingly unrelated error, for example stating that the stage name cannot be a duplicate. This is clearly because the variable envName is not being read.
There are other places in my template where the pipeline variable is read and used successfully, so what am I doing wrong in this particular part of the template script?
If you want to use a variable group, you first need to define the group to be read in:

variables:
- group: ${{ parameters.varGroup }}

After doing this, you should be able to reference the variable with $(envName), or, if compile-time expansion is acceptable, with ${{ variables.envName }}.
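Put together, a minimal sketch of the template might look like this (assuming the group passed in varGroup defines an envName variable; note the stage name still uses the parameter, since stage names must be resolved at compile time):

parameters:
  envName: ''
  varGroup: ''

stages:
- stage: Deploy_for_${{ parameters.envName }}
  variables:
  - group: ${{ parameters.varGroup }}
  jobs:
  - job: deploy
    steps:
    - script: echo "Deploying to $(envName)"  # runtime value from the variable group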
Microsoft Documentation
Recently I've been doing some configuration of my team's CircleCI setup on GitHub. I needed to use a when statement to divide CI logic. I referenced this document (https://circleci.com/docs/2.0/configuration-reference/#logic-statements), but it seems the document is not correct.
Below is my step definition:
...
image_build_step:
  executor: golang_executor
  steps:
    - checkout
    - setup_remote_docker:
        version: 18.09.3
        docker_layer_caching: true
    - define_svc_name:
        jobname: ${CIRCLE_JOB} # This step sets the $SVC variable
    - when:
        conditon:
          equal: [ "${SVC}", "SVC_A" ]
        - aws-ecr/build-and-push-image:
            repo: SVC_A_REPO
            dockerfile: ./Dockerfile
            tag: "latest,${CIRCLE_SHA1},build-${CIRCLE_BUILD_NUM}"
...
I also tried this:
...
image_build_step:
  executor: golang_executor
  steps:
    - checkout
    - setup_remote_docker:
        version: 18.09.3
        docker_layer_caching: true
    - define_svc_name:
        jobname: ${CIRCLE_JOB} # This step sets the $SVC variable
    - when:
        equal: [ "${SVC}", "SVC_A" ]
        - aws-ecr/build-and-push-image:
            repo: SVC_A_REPO
            dockerfile: ./Dockerfile
            tag: "latest,${CIRCLE_SHA1},build-${CIRCLE_BUILD_NUM}"
...
I cannot figure out my mistake in using the when statement on CircleCI. Additionally, circleci config validate .circleci/config.yaml passed before I pushed this commit.
What is the correct usage of the when statement in CircleCI? Joining the CircleCI forum with a GitHub account is a hassle, so I am leaving my question on Stack Overflow.
It's not possible to use environment variables in logic statements. The reason is that logic statements are evaluated at configuration compilation time, whereas environment variables are interpolated at run time.
The only workaround I know of is to use CircleCI's dynamic configuration functionality to set pipeline parameter values in the "setup" workflow, which are then passed to the "continuation" workflow.
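For illustration, once the setup workflow has set a pipeline parameter (a hypothetical svc parameter here), the continuation configuration can branch on it with a compile-time logic statement:

version: 2.1

parameters:
  svc:
    type: string
    default: ""

workflows:
  build-svc-a:
    # Logic statements work here because pipeline parameters are
    # already known when the configuration is compiled.
    when:
      equal: [ "SVC_A", << pipeline.parameters.svc >> ]
    jobs:
      - image_build_step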
Jenkins has a UI concept with dropdown lists, etc. that allows users to specify variables at run time. This has proven essential in our builds for making decisions in the pipeline (i.e. which agent to run on, which code base to choose, etc.). By allowing parameters, we are able to have a single pipeline/definition handle the same task for many clients/releases/environments.
Over the past year I have watched many people ask for this, to reduce the number of almost identical build definitions. Is there a best practice for handling it? It would be nice to have a single build definition for a specific task that is smart enough to handle parameters.
Edit: an example of possible pseudo-code building on levi-lu#MSFT's suggestion.
parameters:
- name: ClientName
  displayName: Pool Image
  default: Select client
  values: powershell
  valuesScript: [
    assemble curl request to http://myUrl.com/Clients/GetAll
  ]
- name: TargetEnvironment
  displayName: Client Environment
  type: string
  values: powershell
  valuesScript: [
    assemble curl request using above parameter value to
    https://myUrl.com/Clients/$(ClientName)/GetEnvironments
  ]

trigger: none

jobs:
- job: build
  displayName: Run pipeline job
  pool:
    vmImage: windows-latest
  parameters:
    ClientName: $(ClientName)
    TargetEnvironment: $(TargetEnvironment)
  steps:
  - script: echo building $(Build.BuildNumber)
Runtime parameters are available now. You can set them at the beginning of your pipeline YAML using the parameters keyword, as in the example below:
parameters:
- name: image
  displayName: Pool Image
  default: ubuntu-latest
  values:
  - windows-latest
  - vs2017-win2016
  - ubuntu-latest
  - ubuntu-16.04
  - macOS-latest
  - macOS-10.14
- name: test
  displayName: Run Tests?
  type: boolean
  default: false

trigger: none

jobs:
- job: build
  displayName: Build and Test
  pool:
    vmImage: ${{ parameters.image }}
  steps:
  - script: echo building $(Build.BuildNumber)
  - ${{ if eq(parameters.test, true) }}:
    - script: echo "Running all the tests"
The example above is from the official Microsoft documentation; see the runtime parameters documentation to learn more.
When you run the above YAML pipeline, you will be able to select the parameter values from a dropdown list when queuing the run.
Update: to set variables dynamically at runtime, you can use the task.setvariable logging command in scripts.
In the example below, $resultValue holds the value returned by a REST API call, and it is assigned to the variable VariableName:
- powershell: |
    $resultValue = call from Rest API   # placeholder: your actual REST call goes here
    echo "##vso[task.setvariable variable=VariableName]$resultValue"
Check the Microsoft documentation on the task.setvariable logging command for more information.
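A slightly fuller sketch of the pattern (Invoke-RestMethod and the URL are stand-ins for whatever call actually returns your value):

- powershell: |
    # Hypothetical REST call; substitute your real endpoint.
    $resultValue = Invoke-RestMethod -Uri "https://myUrl.com/Clients/GetAll"
    echo "##vso[task.setvariable variable=VariableName]$resultValue"

# Later steps can then read the variable at runtime with macro syntax.
- script: echo "VariableName is $(VariableName)"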
I am investigating using Jenkins Job Builder (from OpenStack) as our means of managing Jenkins job configurations. In doing so, I am trying to figure out the right (best?) way to include a job-template from an external file using the !include custom tag.
In the current use case we will basically have one template that is going to be used by a LOT of jobs. Each job is going to need to exist in its own file, for reasons that are out of scope here.
So far I have gotten this to work:
job-template.yml
name: 'pre-build-{proj}-{repo}'
project-type: freestyle
... etc ...
job-1.yml
- job-template:
    !include job-template.yml
- project:
    name: job-1
    proj: my-proj
    repo: my-repo
    jobs:
      - 'build-config-{proj}-{repo}'
This seems wrong because the template definition is split across both files and requires needless duplication of the - job-template: line in every job file. I would like to get the following to work instead:
job-template.yml
- job-template:
    name: 'pre-build-{proj}-{repo}'
    project-type: freestyle
    ... etc ...
job-1.yml
!include job-template.yml
- project:
    name: job-1
    proj: my-proj
    repo: my-repo
    jobs:
      - 'build-config-{proj}-{repo}'
The latter unfortunately results in a YAML parse error on the - project: line:
yaml.scanner.ScannerError: mapping values are not allowed here
in "job-1.yml", line 3, column 10
Is there a way to get the entire template definition into the template file? This will become particularly annoying if we ever need to pull in multiple templates from multiple files.
The jenkins-jobs command takes a path argument which can be a directory holding your files (job-template.yaml, job1.yaml, and job2.yaml). It will assemble them into a single YAML document, so you do not need to use !include. So you can write:
job-template.yaml
- job-template:
    name: 'pre-build-{proj}-{repo}'
    builders:
      - shell: 'echo building {proj} for {repo}'
job1.yaml
- project:
    name: job-1
    proj: my-proj
    repo: my-repo
    jobs:
      - 'pre-build-{proj}-{repo}'
job2.yaml
- project:
    name: job-2
    proj: my-other-proj
    repo: my-other-repo
    jobs:
      - 'pre-build-{proj}-{repo}'
That generates two jobs with the following shell commands:
pre-build-my-other-proj-my-other-repo:
    <command>echo building my-other-proj for my-other-repo</command>
pre-build-my-proj-my-repo:
    <command>echo building my-proj for my-repo</command>
Assuming the files are in a directory config/, you can generate them all with:
jenkins-jobs test config/ -o /tmp/myjobs
Or use the name argument to filter the jobs that will be realized:
jenkins-jobs test config/ -o /tmp/myjobs '*my-proj*'
# Creates pre-build-my-proj-my-repo
# Skips pre-build-my-other-proj-my-other-repo