Is it possible to make a defined step conditional when using it in a Bitbucket pipeline?

I have a monorepo with a bitbucket pipeline.
I want to be able to run a default build whenever I push that only runs the steps for projects in the monorepo that have changed, using a step definition for each project. But I also want to be able to run custom builds for specific environments that run the steps for every project, using the same step definitions.
If I define a step that I want to use in several places, e.g.
definitions:
  steps:
    - step: &ExampleProjectBuildStep
        name: Example Project Build Step
        script:
          - echo 'Example project build step'
    - step: &ExampleProjectBuildStep2
        name: Example Project Build Step 2
        script:
          - echo 'Example project build step 2'
I'd like to be able to run a parallel conditional default build:
pipelines:
  default:
    - parallel:
        - step: *ExampleProjectBuildStep
          condition:
            changesets:
              includePaths:
                - "example_path/**"
        - step: *ExampleProjectBuildStep2
          condition:
            changesets:
              includePaths:
                - "example_path_2/**"
  custom:
    example_custom_pipeline:
      - step: *ExampleProjectBuildStep
      - step: *ExampleProjectBuildStep2
I also want to use the defined step in custom/branch pipeline builds without the condition.
I have a separate monorepo project which is simpler, so I haven't defined the steps there, and the parallel conditional steps work as expected. Is it just not possible in Bitbucket to have conditional steps that use a step definition, without including the condition in the definition and thus requiring two definitions, one conditional and one unconditional?
None of the documentation I've found that covers conditional steps mentions step definitions, and vice versa. I can't find any info on whether this should be possible, but it seems like a surprising oversight if it isn't.
I've tried to make this as clear as possible, but if anything is unclear please highlight and I will try to better explain what I mean.

It is possible to override some of the attributes of an anchored definition with YAML merge keys and overrides.
The way I resolve this in monorepos is usually:
definitions:
  yaml-anchors:
    - &some-step
      name: A step
      condition:
        changesets:
          includePaths:
            - subproject/**
      script:
        - echo "example"

pipelines:
  default:
    - step: *some-step
  custom:
    example:
      - step:
          <<: *some-step
          condition: null
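Applied to the original example, that pattern might look like the sketch below. It is only a sketch: it assumes the changeset conditions move into the anchors themselves, and that Bitbucket accepts condition: null as an override to disable them in the custom pipeline.
definitions:
  yaml-anchors:
    - &ExampleProjectBuildStep
      name: Example Project Build Step
      condition:
        changesets:
          includePaths:
            - "example_path/**"
      script:
        - echo 'Example project build step'
    - &ExampleProjectBuildStep2
      name: Example Project Build Step 2
      condition:
        changesets:
          includePaths:
            - "example_path_2/**"
      script:
        - echo 'Example project build step 2'

pipelines:
  default:
    - parallel:
        - step: *ExampleProjectBuildStep
        - step: *ExampleProjectBuildStep2
  custom:
    example_custom_pipeline:
      - step:
          <<: *ExampleProjectBuildStep
          condition: null   # assumption: nulling the merged condition makes the step unconditional
      - step:
          <<: *ExampleProjectBuildStep2
          condition: null
This keeps a single definition per project: the default build skips a project when nothing under its path changed, while the custom pipeline strips the condition and always runs both steps.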

Related

Avoid triggering a Bitbucket pipeline when the title starts with Draft or WIP

To automate our CI process, I need to run the Bitbucket pipelines only when the title does not start with "Draft" or "WIP". Atlassian only offers this feature: https://support.atlassian.com/bitbucket-cloud/docs/use-glob-patterns-on-the-pipelines-yaml-file/.
I tried with the regex ^(?!Draft:|WIP:).+ like this:
pipelines:
  pull-requests:
    '^(?!Draft:|WIP:).+':
      - step:
          name: Tests
but the pipeline does not start under any circumstances (with or without Draft:/WIP:). Any suggestions?
Note that the PR pattern you define in the pipelines is matched against the source branch, not the PR title. For that reason, I used to define an empty pipeline for PRs from wip/* branches, e.g.
pipelines:
  pull-requests:
    wip/*:
      - step:
          name: Pass
          script:
            - exit 0
    "**":
      - step:
          name: Tests
          # ...
But this workflow requires you to work on wip/* branches and change their source branch later on. This is somewhat cumbersome, and developers just did not opt in.
This works, though.

With CircleCI, is it possible to share an executor between two jobs?

I am rewriting my CircleCI config. Everything was put in only one job and everything was working well, but for some good reasons I want more structure.
Now I have two jobs build and test, and I want the second job to reuse the machine exactly where the build job stopped.
I will later have a third and a fourth job.
Ideally, there would be a built-in CircleCI option, a single line saying that I want to reuse the previous machine/executor.
Other options are Workspaces, which save data on a CircleCI machine, or building and deploying my own Docker image that represents the machine's state after the build job.
What is the easiest way to achieve what I want to do?
Currently, I have basically in my yaml:
jobs:
  build:
    docker:
      - image: cypress/base:14.16.0
    steps:
      - checkout
      - node/install:
          install-yarn: true
          node-version: '16.13'
      - other-long-commands
  test:
    # NOT GOOD: need an executor
    steps:
      - run:
          name: 'test'
          command: 'npx cypress run'
          environment:
            TEST_SUITE: SMOKE
workflows:
  build-and-test:
    jobs:
      - build
      - test:
          requires:
            - build
This can't be done. Workspaces are the solution instead.
My follow-up would be: why do you need two jobs? Depending on your use case, pulling steps out into reusable commands might help, or even an orb.
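If you do keep separate jobs, a minimal workspace sketch looks roughly like this (the image, paths and commands are only illustrative):
jobs:
  build:
    docker:
      - image: cypress/base:14.16.0
    steps:
      - checkout
      - run: yarn install && yarn build   # placeholder for the long build commands
      - persist_to_workspace:
          root: .          # save the whole working directory produced by build
          paths:
            - .
  test:
    docker:
      - image: cypress/base:14.16.0       # the test job still needs its own executor
    steps:
      - attach_workspace:
          at: .            # restore the files persisted by the build job
      - run: npx cypress run
workflows:
  build-and-test:
    jobs:
      - build
      - test:
          requires:
            - build
Persisting the entire directory can be slow; persisting only the build output (for example node_modules and the compiled assets) is usually faster.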

How to run the same Bitbucket Pipeline with different environment variables for different branches?

I have a monorepo project that is deployed to 3 environments - testing, staging and production. Deploys to testing come from the next branch, while staging and production from the master branch. Testing deploys should run automatically on every commit to next (but I'm also fine with having to trigger them manually), but deploys from the master branch should be triggered manually. In addition, every deploy may consist of a client push and server push (depending on the files changed). The commands to deploy to each of the hosts are exactly the same, the only thing changing is the host itself and the environment variables.
Therefore I have 2 questions:
Can I make Bitbucket prompt me for the deployment target when I manually trigger the pipeline, thus basically letting me choose the set of env variables to inject into the set sequence of commands? I've seen a screenshot of this in a tutorial, but I lost it and haven't been able to find it since.
Can I have parallel sequences of commands? I'd like the server and the client push to run simultaneously, but both of them have different steps. Or do I need to merge those into the same step with multiple scripts to achieve that?
Thank you for your help.
The answer to both of your questions is 'Yes'.
The feature that makes it possible is called custom pipelines. Here is a neat doc that demonstrates how to use them.
There is a parallel keyword which you can use to define parallel steps. Check out this doc for details.
If I'm not misinterpreting the description of your setup, your final pipeline should look very similar to this:
pipelines:
  custom:
    deploy-to-staging-or-prod: # As you say the steps are the same, only variable values will define the destination.
      - variables: # List variable names under here, and Bitbucket will prompt you to supply their values.
          - name: VAR1
          - name: VAR2
      - parallel:
          - step:
              script:
                - ./deploy-client.sh
          - step:
              script:
                - ./deploy-server.sh
  branches:
    next:
      - step:
          script:
            - ./deploy-to-testing.sh
UPD
If you need to use Deployments instead of providing each variable separately, you can utilise the manual trigger type:
definitions:
  steps:
    - step: &RunTests
        script:
          - ./run-tests.sh
    - step: &DeployFromMaster
        script:
          - ./deploy-from-master.sh

pipelines:
  branches:
    next:
      - step:
          script:
            - ./deploy-to-testing.sh
    master:
      - step: *RunTests
      - parallel:
          - step:
              <<: *DeployFromMaster
              deployment: staging
              trigger: manual
          - step:
              <<: *DeployFromMaster
              deployment: production
              trigger: manual
The key docs for understanding this pipeline are still this one, plus this one for YAML anchors. Keep in mind that I introduced the 'RunTests' step on purpose: since a pipeline is triggered on a commit, you can't make the first step manual. It acts as a stopper for the deploy steps, which can only be manual due to your requirements.

How to concatenate two variables in a Bitbucket pipeline YAML file

I wanted to create a new variable in my Bitbucket pipeline whose value is the concatenation of two Bitbucket tokens, so I tried this, but the new value is not working.
image: node:16.13.0
SOME_VALUE: $(date +%Y-%m-%d)-${BITBUCKET_BUILD_NUMBER}
branches:
  develop:
    - step:
        name: Build
        size: 2x
        script:
          - echo ${SOME_VALUE}
Appreciate any help on this!
This may not work because your variable is not defined in a step. In Bitbucket Pipelines, each step has its own build environment. You are trying to create a dynamic variable using date, so define it inside a step. If you want to use this variable in multiple steps, you can use artifacts.
image: node:16.13.0
pipelines:
  branches:
    develop:
      - step:
          name: Build
          size: 2x
          script:
            - SOME_VALUE=$(date +%Y-%m-%d)-${BITBUCKET_BUILD_NUMBER}
            - echo ${SOME_VALUE}
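If the value really is needed in more than one step, a common pattern (just a sketch, using a file-based hand-off) is to write it to a file, declare that file as an artifact, and source it in the later steps:
image: node:16.13.0
pipelines:
  branches:
    develop:
      - step:
          name: Build
          script:
            - echo "export SOME_VALUE=$(date +%Y-%m-%d)-${BITBUCKET_BUILD_NUMBER}" > set_env.sh
            - source set_env.sh
            - echo ${SOME_VALUE}
          artifacts:
            - set_env.sh          # made available to the following steps
      - step:
          name: Use the value
          script:
            - source set_env.sh   # re-creates SOME_VALUE in this step's environment
            - echo ${SOME_VALUE}
The file name set_env.sh is arbitrary; any path covered by the artifacts glob works.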

CircleCI: A job that requires a human to start

I'm iterating on adding database migrations to a project. For the first step, I've made a repository that runs migrations. Now I need to make it so these migrations run against the stage/prod environment. I do not want this to happen on every commit. Does CircleCI provide a way to have a button that I can click to run a job?
I think ideally I'd have 2 buttons. One for running migrations on stage, one for running them on prod. Is this possible?
There is a manual approval process for workflows.
https://circleci.com/docs/2.0/workflows/#holding-a-workflow-for-a-manual-approval
workflows:
  version: 2
  build-test-and-approval-deploy:
    jobs:
      - build
      - test1:
          requires:
            - build
      - test2:
          requires:
            - test1
      - hold:
          type: approval
          requires:
            - test2
      - deploy:
          requires:
            - hold
It's pretty limited. You can't use it to start a build.
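For the two-buttons part of the question, one possible sketch (job names are illustrative, and it assumes migrate-stage and migrate-prod jobs are defined elsewhere in the config) is to gate two separate migration jobs behind two approval jobs. The workflow still appears on every commit, but nothing runs until someone approves the corresponding hold:
workflows:
  version: 2
  migrate:
    jobs:
      - hold-stage:
          type: approval      # the "button" for stage
      - migrate-stage:
          requires:
            - hold-stage
      - hold-prod:
          type: approval      # the "button" for prod
      - migrate-prod:
          requires:
            - hold-prod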
