Azure Pipelines UI to accept parameters (like Jenkins) - jenkins

Jenkins has a UI concept with dropdown lists, etc. that lets users specify variables at run time. This has proven essential in our builds for making decisions in the pipeline (e.g. which agent to run on, which code base to choose). By allowing parameters, we are able to have a single pipeline/definition handle the same task for many clients/releases/environments.
I have been watching many people ask for this over the past year - to reduce the number of almost identical build definitions - is there a best practice for handling this? It would be nice to have a single build definition for a specific task that is smart enough to handle parameters.
Edit: an example of possible pseudo-code to build on levi-lu-MSFT's suggestion.
parameters:
- name: ClientName
  displayName: Pool Image
  default: Select client
  values: powershell
  valuesScript: [
    assemble curl request to http://myUrl.com/Clients/GetAll
  ]
- name: TargetEnvironment
  displayName: Client Environment
  type: string
  values: powershell
  valuesScript: [
    assemble curl request using above parameter value to
    https://myUrl.com/Clients/$(ClientName)/GetEnvironments
  ]

trigger: none

jobs:
- job: build
  displayName: Run pipeline job
  pool:
    vmImage: windows-latest
  parameters:
    ClientName: $(ClientName)
    TargetEnvironment: $(TargetEnvironment)
  steps:
  - script: echo building $(Build.BuildNumber)

Runtime parameters are available now. You can set runtime parameters at the beginning of your pipeline YAML using parameters. For example:
parameters:
- name: image
  displayName: Pool Image
  default: ubuntu-latest
  values:
  - windows-latest
  - vs2017-win2016
  - ubuntu-latest
  - ubuntu-16.04
  - macOS-latest
  - macOS-10.14
- name: test
  displayName: Run Tests?
  type: boolean
  default: false

trigger: none

jobs:
- job: build
  displayName: Build and Test
  pool:
    vmImage: ${{ parameters.image }}
  steps:
  - script: echo building $(Build.BuildNumber)
  - ${{ if eq(parameters.test, true) }}:
    - script: echo "Running all the tests"
The above example is from the official Microsoft documentation. Click here to learn more about runtime parameters.
When you run the above YAML pipeline, you will be able to select each parameter's value from a dropdown list. See the screenshot below.
Update: to set variables dynamically at runtime.
You can use the task.setvariable logging command to set variables dynamically in scripts.
In the example below, $resultValue is the value returned from a REST API call, and it is assigned to the variable VariableName:
- powershell: |
    $resultValue = call from Rest API
    echo "##vso[task.setvariable variable=VariableName]$resultValue"
Check the documentation here for more information.
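As a rough sketch of how that might look end to end (the endpoint URL and the JSON property used below are hypothetical, not part of the original answer), the REST call and the setvariable command can live in one PowerShell step, and a later step can read the variable with macro syntax:

steps:
- powershell: |
    # Hypothetical endpoint and response shape; replace with your own service
    $response = Invoke-RestMethod -Uri "https://myUrl.com/Clients/GetAll" -Method Get
    $resultValue = $response[0].name
    # Expose the value to subsequent steps in this job
    echo "##vso[task.setvariable variable=VariableName]$resultValue"
  displayName: Set variable from REST API
- script: echo "VariableName is $(VariableName)"
  displayName: Read the variable in a later step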

Related

Pass environment variables to reusable workflow without using 1 to 1 inputs

Right now, if I want to pass an environment variable to a reusable workflow, I have to do something like this:
name: Reusable workflow

on:
  workflow_call:
    inputs:
      my_env_var:
        required: false
        type: string

env:
  my_env_var: ${{ inputs.my_env_var }}
However, for this I first need to define each environment variable I want to pass as an input. This works, but having to hard-code the environment variables makes my reusable workflows less generic. Is there a way to pass envs without having to define them one by one? I was thinking of something like this:
name: Calling reusable workflow

on:
  workflow_dispatch:

jobs:
  push-image-dev:
    uses: ./.github/workflows/my-reusable-workflow.yml
    with:
      input1: ...
      input2: ...
    env:
      env1: ...
      env2: ...
However, I have been reading some documentation and I don't think that exists. Is there any other way of doing it, such as inheriting env variables or creating a single input that is a dictionary of variables, which is later parsed to set all the env vars in the reusable workflow?
I created this simple logic, which allows me to pass all the environment variables I want in a 1:N relationship with respect to inputs. I created an input that expects a list of environment variables, formatted as "env=value", which is later converted to environment variables in a step inside the workflow.
Calling the workflow:
name: Calling reusable workflow

on:
  workflow_dispatch:

jobs:
  my-job:
    uses: ./.github/workflows/my-reusable-workflow.yml
    with:
      env_vars: |
        env1=value1
        env2=value2
        env3=value3
Workflow definition:
name: My reusable workflow

on:
  workflow_call:
    inputs:
      env_vars:
        description: List of environment variables to set up, given in env=value format.
        required: false
        type: string

jobs:
  my-job:
    runs-on: ubuntu-latest
    steps:
      - name: Set environment variables
        if: ${{ inputs.env_vars }}
        run: |
          for i in "${{ inputs.env_vars }}"
          do
            printf "%s\n" $i >> $GITHUB_ENV
          done
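One caveat: because that loop relies on word splitting, values containing spaces would be split apart. If that matters, a slightly more defensive variant of the same step (my own sketch, not part of the original answer) reads the input line by line:

      - name: Set environment variables
        if: ${{ inputs.env_vars }}
        run: |
          # Read one env=value pair per line so values may contain spaces
          while IFS= read -r pair; do
            [ -n "$pair" ] && printf "%s\n" "$pair" >> "$GITHUB_ENV"
          done <<< "${{ inputs.env_vars }}"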

CircleCI pipeline: set a variable within a job and read it from another, with the condition evaluating as empty

I'm trying to run some steps in a CircleCI pipeline based on conditions set in a previous step. I tried a lot of tricks, like exposing the value from Step 1 through global vars and picking it up in Step 2; I can see and print the variables in Step 2, but the when block always evaluates as empty. I searched a lot and learned that logic conditions are evaluated before the jobs run. Is there an alternative way to execute steps in the second job when a condition was set in Step 1?
Here is the example that I'm trying to fix:
version: 2.1

orbs:

workflows:
  test-and-deploy:
    jobs:
      - set-data:
          context: my-context
      - read-data:
          context: my-context
          requires:
            - set-data

definitions:
  node_image: &node-image
    docker:
      - image: cimg/node:14.15.5

executors:
  base-12-14-0:
    description: |
      Single Docker container with Node 12.14.0 and Cypress dependencies
      see https://github.com/cypress-io/cypress-docker-images/tree/master/base.
      Use example: `executor: cypress/base-12-14-0`.
    docker:
      - image: cypress/base:12.14.0

jobs:
  set-data:
    <<: *node-image
    description: Sets the data
    steps:
      - run: echo "VAR=app" > global-vars
      - persist_to_workspace:
          root: .
          paths:
            - global-vars
  read-data:
    <<: *node-image
    description: read the data
    steps:
      - attach_workspace:
          at: .
      - run: ls
      - run: cat global-vars # I can see the correct VAR inside global-vars here
      - run: cat global-vars >> $BASH_ENV
      - run: echo "Test $VAR" # successfully printed
      - when:
          condition:
            matches: {
              pattern: "app",
              value: $VAR
            }
          steps:
            - run: echo "Condition Executed"
It's not possible to use environment variables in logic statements. The reason is that logic statements are evaluated at configuration compilation time, whereas environment variables are interpolated at run time.
The only workaround I know of is to use the CircleCI dynamic configuration functionality to set pipeline parameters' values in the "setup workflow" that you then pass to the "continuation" workflow.
And by the way, you're not using $BASH_ENV correctly (https://circleci.com/docs/env-vars#setting-an-environment-variable-in-a-shell-command). But again, even if you did, you wouldn't be able to use an environment variable in a logic statement.
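A minimal sketch of that dynamic-configuration workaround, assuming the circleci/continuation orb and dynamic config enabled for the project (the parameter name and the decision logic here are hypothetical), could look like this:

# .circleci/config.yml -- setup workflow
version: 2.1
setup: true
orbs:
  continuation: circleci/continuation@0.3.1
jobs:
  decide:
    docker:
      - image: cimg/base:stable
    steps:
      - checkout
      - run:
          name: Compute pipeline parameters
          command: |
            # hypothetical decision logic; write the parameter values as JSON
            echo '{"var-is-app": true}' > pipeline-parameters.json
      - continuation/continue:
          configuration_path: .circleci/continue-config.yml
          parameters: pipeline-parameters.json
workflows:
  setup:
    jobs:
      - decide

# .circleci/continue-config.yml -- continuation workflow
version: 2.1
parameters:
  var-is-app:
    type: boolean
    default: false
jobs:
  conditional-job:
    docker:
      - image: cimg/base:stable
    steps:
      - run: echo "Condition Executed"
workflows:
  main:
    # pipeline parameters, unlike environment variables, can be used in logic statements
    when: << pipeline.parameters.var-is-app >>
    jobs:
      - conditional-job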

Conditionally setting parameter in yml file (Azure pipeline): VAR not updating

Problem
I want to set a parameter conditionally based on which branch triggered the pipeline. If the triggered branch was feature/automated-testing, I would like to set a parameter equal to "True". See the code below.
Parts of my pipeline.yml file looks like so:
trigger:
  branches:
    include:
      - feature/automated-testing

...

# Global variables for the pipeline
variables:
  - name: "triggerRepoName"
    value: "$(Build.SourceBranchName)"

stages:
  # common stage. Docker build, tag and push
  - stage: BuildDockerImage
    displayName: "Build docker image"
    variables:
      ...
    jobs:
      - template: /templates/pipelines/my-prject.yml#templates
        parameters:
          ${{ if eq( variables.triggerRepoName, 'feature/automated-testing') }}:
            runTests: "True"
          ${{ if ne(variables.triggerRepoName, 'feature/automated-testing') }}:
            runTests: "False"
Question
When I push from the branch feature/automated-testing and `echo` the variable runTests in the Dockerfile, it is blank. Is there something wrong with my syntax in the conditional statement?
I believe the error is in the way the variable is set conditionally, so I have chosen not to supply the Dockerfile or the other template .yml files used.
Please change variables.triggerRepoName to variables['triggerRepoName']. It should solve your issue.
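For reference, a minimal sketch of the template call from the question with that index syntax applied would be:

jobs:
  - template: /templates/pipelines/my-prject.yml#templates
    parameters:
      ${{ if eq(variables['triggerRepoName'], 'feature/automated-testing') }}:
        runTests: "True"
      ${{ if ne(variables['triggerRepoName'], 'feature/automated-testing') }}:
        runTests: "False"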

How do we conditionally run a CircleCI workflow?

I have followed the guide described in Conditional steps in jobs and conditional workflows and written the below code for my CircleCI pipeline.
version: 2.1

workflows:
  version: 2.1
  workflowone:
    when:
      condition: false
    jobs:
      - samplejob
  workflowtwo:
    when:
      condition: true
    jobs:
      - jobone

jobs:
  samplejob:
    docker:
      - image: buildpack-deps:stable
    steps:
      - run:
          name: Sample Job in WF 1
          command: |
            echo "This job is in workflowone and the workflow should not run"
  jobone:
    docker:
      - image: buildpack-deps:stable
    steps:
      - run:
          name: Sample Job in WF 2
          command: |
            echo "This job is in workflowtwo and the workflow should run"
When I run the above code the output is not what I expect. The first workflow should not run because its condition is false, yet both workflows start running when the pipeline is triggered. Can anyone point out the missing piece here?
According to the CircleCI docs, workflows (specifically) do not accept the condition key:
Note: When using logic statements at the workflow level, do not
include the condition: key (the condition key is only needed for job
level logic statements).
See here logic-statement-examples (scroll to the bottom of this section to see the note)
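In other words, at the workflow level the logic statement goes directly under when. A minimal sketch of the example above rewritten that way, assuming a pipeline parameter drives the choice (the parameter name is mine, not from the docs), would be:

version: 2.1

parameters:
  run-workflow-one:
    type: boolean
    default: false

workflows:
  workflowone:
    # logic statement placed directly under `when`, no `condition:` key
    when: << pipeline.parameters.run-workflow-one >>
    jobs:
      - samplejob
  workflowtwo:
    when:
      not: << pipeline.parameters.run-workflow-one >>
    jobs:
      - jobone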

jenkins-job-builder doesn't propagate a variable value

I am using jenkins-job-builder to create my pipeline project, but I have a problem with variable values when I try to reuse or propagate them.
This is my project configuration:
- project:
    name: myproject
    git_url: git@gitlabserver.cu:demos-products/myproject.git
    jobs:
      - '{name}-nfr-smoke-tests':
          pipeline-next: '{name}-nfr-smoke-tests'
And here is my job-template:
- job-template:
    name: "{name}-nfr-smoke-tests"
    node: 'slave1'
    scm:
      - git:
          skip-tag: false
          url: 'git@gitlabserver.cu:test-products/{name}-nfr-tests.git'
          branches:
            - master
          wipe-workspace: true
    builders:
      - shell: |
          bundle install
          bundle exec cucumber features/smoke.feature
    publishers:
      - trigger:
          project: "{pipeline-next}"
          threshold: SUCCESS
OK, now when I run this configuration in Jenkins and check the job's build, it says:
No such project ‘{name}-nfr-smoke-tests’. Did you mean ‘myproject-nfr-smoke-tests’?
Why doesn't the line pipeline-next: '{name}-nfr-smoke-tests' propagate the value of the name variable instead of using it as a literal string? Am I missing something?
You are missing 'name' under the 'project' section in your job-template. Append the following lines:
- project:
    name: project-name
The purpose of a project is to collect related jobs together and provide values for the variables in a job template.
I found out that Jenkins Job Builder version 0.9.0-0.2 does not propagate the value, but for me version 1.3.0+2015.12.15.git136.959eb4b909-1 did. Perhaps updating your version of Jenkins Job Builder might help?
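If upgrading is not possible, one workaround (my own sketch, not part of the original answers) is to define pipeline-next at the project level with the already-expanded job name, so no nested {name} substitution is required:

- project:
    name: myproject
    git_url: git@gitlabserver.cu:demos-products/myproject.git
    # literal value avoids relying on nested variable expansion
    pipeline-next: 'myproject-nfr-smoke-tests'
    jobs:
      - '{name}-nfr-smoke-tests'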
