I want to limit certain steps to specific branches:
pipelines:
  branches:
    '{feature/*,release/*,hotfix/*,bugfix/*}':
      - step: # Generates credentials needed for testing and building
          script:
            - echo "All branches, this step is needed for release"
      - parallel:
          - step:
              script: # Some lint and unit test scripts
                - echo "All branches, this step is needed for the release step"
    '{release/*}':
      - step: # Actual test and build script
          script:
            - echo "Only in release branch"
In the sample above I want the first step to run on almost all branches, but the second step should only execute for the release branch pattern. Unfortunately, the actual result was that the first step only works on branches like hotfix/* and never gets triggered on release/version-name when branching out. How is this supposed to be done? Basically the idea is to reuse the first two steps on release branches, so there is no need to duplicate them.
I finally managed to do it with YAML anchors:
definitions:
  steps:
    - step: &key-generator # Generates credentials needed for testing and building
        name: Generating Key
        script:
          - echo "All branches, this step is needed for release"
    - parallel: &lint-test
        - step:
            name: Lint check
            script: # Some lint and unit test scripts
              - echo "All branches, this step is needed for the release step"

pipelines:
  branches:
    '{feature/*,release/*,hotfix/*,bugfix/*}':
      - step: *key-generator # YAML anchor reference
      - parallel: *lint-test # YAML anchor reference
    '{release/*}':
      - step: *key-generator # YAML anchor reference
      - parallel: *lint-test # YAML anchor reference
      - step: # Actual test and build script
          script:
            - echo "Only in release branch"
It is possible to declare variables inside the pipeline file, as in this GitHub Actions example:
# ...
env:
  NODE_VERSION: 16.3.1
  FOLDER_PATH: Project
# ...
steps:
  - name: Move to project folder
    run: cd $FOLDER_PATH
# ...
Is it possible to do something similar in Bitbucket pipeline files? (How?)
Thanks for any help :)
No.
There is a feature request for that: https://jira.atlassian.com/browse/BCLOUD-17453.
Still "gathering interest", though.
The nearest approximation is to write a YAML anchor that exports those vars and use it in every step.
definitions:
  yaml-anchors:
    - &setenv-script >-
        export NODE_VERSION=16.3.1
        && export FOLDER_PATH=Project

pipelines:
  default:
    - step:
        script:
          - *setenv-script
          - ...
    - step:
        script:
          - *setenv-script
          - ...
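Note that the >- folded block scalar collapses the anchor into a single string, so each step effectively runs the one-line command

    export NODE_VERSION=16.3.1 && export FOLDER_PATH=Project

which is why the two exports are chained with &&.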
I'm trying to run some steps in a CircleCI pipeline based on conditions established in a previous step. I tried a lot of tricks, like exporting a value from Step 1 into global variables and picking it up in Step 2. I can see and print the variables in Step 2, but the when block always evaluates as empty. I searched a lot and learned that logic conditions are evaluated before the jobs run. Is there an alternative way to execute steps in the second job when a condition was established in Step 1?
Here is the example that I'm trying to fix:
version: 2.1

orbs:

workflows:
  test-and-deploy:
    jobs:
      - set-data:
          context: my-context
      - read-data:
          context: my-context
          requires:
            - set-data

definitions:
  node_image: &node-image
    docker:
      - image: cimg/node:14.15.5

executors:
  base-12-14-0:
    description: |
      Single Docker container with Node 12.14.0 and Cypress dependencies
      see https://github.com/cypress-io/cypress-docker-images/tree/master/base.
      Use example: `executor: cypress/base-12-14-0`.
    docker:
      - image: cypress/base:12.14.0

jobs:
  set-data:
    <<: *node-image
    description: Sets the data
    steps:
      - run: echo "VAR=app" > global-vars
      - persist_to_workspace:
          root: .
          paths:
            - global-vars
  read-data:
    <<: *node-image
    description: Reads the data
    steps:
      - attach_workspace:
          at: .
      - run: ls
      - run: cat global-vars # here I can see the correct VAR inside global-vars
      - run: cat global-vars >> $BASH_ENV
      - run: echo "Test $VAR" # successfully printed
      - when:
          condition:
            matches:
              pattern: "app"
              value: $VAR
          steps:
            - run: echo "Condition Executed"
It's not possible to use environment variables in logic statements. The reason is that logic statements are evaluated at configuration compilation time, whereas environment variables are interpolated at run time.
The only workaround I know of is to use the CircleCI dynamic configuration functionality to set pipeline parameters' values in the "setup workflow" that you then pass to the "continuation" workflow.
And by the way, you're not using $BASH_ENV correctly (https://circleci.com/docs/env-vars#setting-an-environment-variable-in-a-shell-command). But again, even if you did, you wouldn't be able to use an environment variable in a logic statement.
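For reference, here is a minimal sketch of that setup/continuation approach. It assumes the circleci/continuation orb (the orb version shown is illustrative; pin whichever current version you use), a two-file layout (.circleci/config.yml as the setup workflow, .circleci/continue-config.yml as the continuation), and a hypothetical var-is-app pipeline parameter; the grep check is a stand-in for your own runtime logic:

# .circleci/config.yml (setup workflow)
version: 2.1
setup: true

orbs:
  continuation: circleci/continuation@0.3.1

jobs:
  decide:
    docker:
      - image: cimg/base:stable
    steps:
      - checkout
      - run:
          name: Compute the parameter value at run time
          command: |
            # illustrative runtime check; replace with your own logic
            if grep -q "app" global-vars; then
              echo '{"var-is-app": true}' > params.json
            else
              echo '{"var-is-app": false}' > params.json
            fi
      - continuation/continue:
          configuration_path: .circleci/continue-config.yml
          parameters: params.json

workflows:
  setup:
    jobs:
      - decide

# .circleci/continue-config.yml (continuation workflow)
version: 2.1

parameters:
  var-is-app:
    type: boolean
    default: false

jobs:
  conditional-job:
    docker:
      - image: cimg/base:stable
    steps:
      - run: echo "Condition Executed"

workflows:
  main:
    when: << pipeline.parameters.var-is-app >>
    jobs:
      - conditional-job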
I have followed the guide described in Conditional steps in jobs and conditional workflows and written the code below for my CircleCI pipeline.
version: 2.1

workflows:
  version: 2.1
  workflowone:
    when:
      condition: false
    jobs:
      - samplejob
  workflowtwo:
    when:
      condition: true
    jobs:
      - jobone

jobs:
  samplejob:
    docker:
      - image: buildpack-deps:stable
    steps:
      - run:
          name: Sample Job in WF 1
          command: |
            echo "This job is in workflowone and the workflow should not run"
  jobone:
    docker:
      - image: buildpack-deps:stable
    steps:
      - run:
          name: Sample Job in WF 2
          command: |
            echo "This job is in workflowtwo and the workflow should run"
When I run the above code, the output is not what I expected. The first workflow should not run because its condition is false, yet both workflows start running when the pipeline is triggered. Can anyone point out the missing piece here?
According to the CircleCI docs, workflows (specifically) do not accept the condition key:
"Note: When using logic statements at the workflow level, do not include the condition: key (the condition key is only needed for job level logic statements)."
See the logic-statement-examples section of the docs (scroll to the bottom of that section to see the note).
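Applied to the config in the question, the workflow-level when takes the logic statement directly. A sketch of the corrected workflows block (boolean literals are accepted as logic statements):

workflows:
  workflowone:
    when: false # no condition: key at the workflow level
    jobs:
      - samplejob
  workflowtwo:
    when: true
    jobs:
      - jobone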
Recently I've made some configuration changes to my team's CircleCI setup on GitHub. I needed to use a when statement to divide the CI logic. I referenced this document (https://circleci.com/docs/2.0/configuration-reference/#logic-statements), but it seems the document is not correct.
Below is my step definition:
...
image_build_step:
  executor: golang_executor
  steps:
    - checkout
    - setup_remote_docker:
        version: 18.09.3
        docker_layer_caching: true
    - define_svc_name:
        jobname: ${CIRCLE_JOB} # On this step set the $SVC variable
    - when:
        conditon:
          equal: [ "${SVC}", "SVC_A" ]
        steps:
          - aws-ecr/build-and-push-image:
              repo: SVC_A_REPO
              dockerfile: ./Dockerfile
              tag: "latest,${CIRCLE_SHA1},build-${CIRCLE_BUILD_NUM}"
...
I also already tried this:
...
image_build_step:
  executor: golang_executor
  steps:
    - checkout
    - setup_remote_docker:
        version: 18.09.3
        docker_layer_caching: true
    - define_svc_name:
        jobname: ${CIRCLE_JOB} # On this step set the $SVC variable
    - when:
        equal: [ "${SVC}", "SVC_A" ]
        steps:
          - aws-ecr/build-and-push-image:
              repo: SVC_A_REPO
              dockerfile: ./Dockerfile
              tag: "latest,${CIRCLE_SHA1},build-${CIRCLE_BUILD_NUM}"
...
I cannot figure out my mistake in using the when statement in CircleCI. Additionally, the config already passed circleci config validate .circleci/config.yaml before I pushed this commit.
What is the correct usage of the when statement in CircleCI? Joining the CircleCI forum with a GitHub account is a hassle, so I'm leaving my question on Stack Overflow.
It's not possible to use environment variables in logic statements. The reason is that logic statements are evaluated at configuration compilation time, whereas environment variables are interpolated at run time.
The only workaround I know of is to use the CircleCI dynamic configuration functionality to set pipeline parameters' values in the "setup workflow" that you then pass to the "continuation" workflow.
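That said, when conditions do work with values that are known at configuration compilation time, such as pipeline parameters. A minimal sketch, where the svc pipeline parameter is illustrative and stands in for the runtime ${SVC} variable:

version: 2.1

parameters:
  svc:
    type: string
    default: ""

jobs:
  image_build_step:
    docker:
      - image: cimg/base:stable
    steps:
      - checkout
      - when:
          condition:
            equal: [ "SVC_A", << pipeline.parameters.svc >> ]
          steps:
            - run: echo "Building SVC_A"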
I am trying to run the sonarcloud-quality-gate check after performing sonarcloud-scan, because I want the Bitbucket build pipeline to fail if the quality gate check fails.
Doing this, I get an error like this:
Quality Gate failed: Could not get scanner report: [Errno 2] No such file or directory: '/opt/atlassian/pipelines/agent/build/.bitbucket/pipelines/generated/pipeline/pipes/sonarsource/sonarcloud-scan/sonarcloud-scan.log'
This is how my bitbucket-pipelines.yml looks:
image: node:10.15.3

clone:
  depth: full # SonarCloud scanner needs the full history to assign issues properly

definitions:
  caches:
    sonar: ~/.sonar/cache # Caching SonarCloud artifacts will speed up your build
  steps:
    - step: &build-test-sonarcloud
        name: Build, test and analyze on SonarCloud
        caches:
          - node
          - sonar
        script:
          - npm install --quiet
          - npm run test:coverage
          - pipe: sonarsource/sonarcloud-scan:0.1.5
            variables:
              SONAR_TOKEN: ${SONAR_TOKEN}
              EXTRA_ARGS: '-Dsonar.sources=src -Dsonar.tests=src -Dsonar.test.inclusions="**.test.jsx" -Dsonar.javascript.lcov.reportPaths=coverage/lcov.info'
          - pipe: sonarsource/sonarcloud-quality-gate:0.1.1
            variables:
              SONAR_TOKEN: ${SONAR_TOKEN}

pipelines:
  default:
    - step: *build-test-sonarcloud
The sonarcloud-scan pipe itself runs successfully, though.
The problem is that the sonarsource/sonarcloud-quality-gate pipe requires a newer version of the sonarsource/sonarcloud-scan pipe. (This was the case ever since the first release of the sonarsource/sonarcloud-quality-gate pipe.)
Change your pipeline configuration like this:
- pipe: sonarsource/sonarcloud-scan:1.0.1
  variables:
    SONAR_TOKEN: ${SONAR_TOKEN}
    EXTRA_ARGS: '-Dsonar.sources=src -Dsonar.tests=src -Dsonar.test.inclusions="**.test.jsx" -Dsonar.javascript.lcov.reportPaths=coverage/lcov.info'
- pipe: sonarsource/sonarcloud-quality-gate:0.1.3
  variables:
    SONAR_TOKEN: ${SONAR_TOKEN}
An easy way to see the latest versions is in the pipeline editor. When you edit the bitbucket-pipelines.yml file, a sidebar opens where you can filter the list of pipes by entering "sonar". Click on a pipe to see its details and note the version used.