How to concatenate two variables in a Bitbucket Pipelines YAML file

I want to create a new variable in a Bitbucket pipeline whose value is the concatenation of two Bitbucket tokens. I tried the following, but the new value is not working:
image: node:16.13.0

SOME_VALUE: $(date +%Y-%m-%d)-${BITBUCKET_BUILD_NUMBER}

pipelines:
  branches:
    develop:
      - step:
          name: Build
          size: 2x
          script:
            - echo ${SOME_VALUE}
Appreciate any help on this!

This does not work because your variable is not defined in a step. In Bitbucket Pipelines, each step has its own build environment. You are trying to create a dynamic variable using date, so set it inside a step's script. If you want to use the variable in multiple steps, you can pass it between them with artifacts (see the sketch after the example below).
image: node:16.13.0

pipelines:
  branches:
    develop:
      - step:
          name: Build
          size: 2x
          script:
            - SOME_VALUE=$(date +%Y-%m-%d)-${BITBUCKET_BUILD_NUMBER}
            - echo ${SOME_VALUE}
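If you need the value in a later step, a common pattern is to write it to a file, declare the file as an artifact, and source it in the next step. A minimal sketch (the set_env.sh filename and the step names are my assumptions, not part of the original answer):

image: node:16.13.0

pipelines:
  branches:
    develop:
      - step:
          name: Build
          script:
            # Compute the value once and persist it to a file.
            - echo "export SOME_VALUE=$(date +%Y-%m-%d)-${BITBUCKET_BUILD_NUMBER}" > set_env.sh
          artifacts:
            - set_env.sh  # Made available to the following steps.
      - step:
          name: Use the value
          script:
            # Re-create the variable from the artifact.
            - source set_env.sh
            - echo ${SOME_VALUE}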

Related

With CircleCI, is it possible to share an executor between two jobs?

I am rewriting my CircleCI config. Everything used to live in a single job and worked well, but for good reasons I want more structure.
Now I have two jobs, build and test, and I want the second job to reuse the machine exactly where the build job stopped.
I will later add a third and a fourth job.
Ideally there would be a built-in CircleCI directive that says "reuse the previous machine/executor".
Other options are workspaces, which save data on a CircleCI machine, or building and deploying my own Docker image that represents the machine's state after the build job.
What is the easiest way to achieve what I want to do?
Currently, my YAML looks basically like this:
jobs:
  build:
    docker:
      - image: cypress/base:14.16.0
    steps:
      - checkout
      - node/install:
          install-yarn: true
          node-version: '16.13'
      - other-long-commands
  test:
    # NOT GOOD: needs an executor
    steps:
      - run:
          name: 'test'
          command: 'npx cypress run'
          environment:
            TEST_SUITE: SMOKE

workflows:
  build-and-test:
    jobs:
      - build
      - test:
          requires:
            - build
It can't be done; workspaces are the solution instead.
My follow-up would be: why do you need two jobs? Depending on your use case, pulling steps out into reusable commands might help, or even an orb.
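For illustration, a minimal workspace sketch under the question's setup (the whole-checkout persist and the exact job layout are my assumptions, not from the original answer):

jobs:
  build:
    docker:
      - image: cypress/base:14.16.0
    steps:
      - checkout
      # ... long build commands ...
      - persist_to_workspace:
          root: .    # Save the build job's working directory...
          paths:
            - '*'    # ...so later jobs can restore it ('*' skips dotfiles).
  test:
    docker:
      - image: cypress/base:14.16.0  # Each job still needs its own executor.
    steps:
      - attach_workspace:
          at: .      # Restore the files where the build job left them.
      - run: npx cypress run

Note that a workspace restores files, not the running machine itself, which is why an executor must still be declared on every job.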

Is it possible to make a defined step conditional when using it in a Bitbucket pipeline?

I have a monorepo with a bitbucket pipeline.
I want to be able to run a default build whenever I push that only runs the steps for projects in the monorepo that have changed, using a step definition for each project. But I also want to be able to run custom builds for specific environments that run for every project, using the same step definitions.
If I define a step that I want to use in several places, e.g.
definitions:
  steps:
    - step: &ExampleProjectBuildStep
        name: Example Project Build Step
        script:
          - echo 'Example project build step'
    - step: &ExampleProjectBuildStep2
        name: Example Project Build Step 2
        script:
          - echo 'Example project build step 2'
I'd like to be able to run a parallel conditional default build:
pipelines:
  default:
    - parallel:
        - step: *ExampleProjectBuildStep
          condition:
            changesets:
              includePaths:
                - "example_path/**"
        - step: *ExampleProjectBuildStep2
          condition:
            changesets:
              includePaths:
                - "example_path_2/**"
  example_custom_pipeline:
    - step: *ExampleProjectBuildStep
    - step: *ExampleProjectBuildStep2
I also want to use the defined step in custom/branch pipeline builds without the condition.
I have a separate, simpler monorepo project where I haven't used step definitions, and there the parallel conditional steps work as expected. Is it just not possible in Bitbucket to have conditional steps that use a step definition, without including the condition in the definition and thus requiring two definitions, one conditional and one unconditional?
None of the documentation I've found that covers conditional steps mentions step definitions, and vice versa. I can't find any info on whether this should be possible, but it seems like a surprising oversight if it isn't.
I've tried to make this as clear as possible, but if anything is unclear please highlight and I will try to better explain what I mean.
It is possible to override some of the attributes of an anchored definition with YAML merge operators.
The way I resolve this in monorepos is usually:
definitions:
  yaml-anchors:
    - &some-step
      name: A step
      condition:
        changesets:
          includePaths:
            - subproject/**
      script:
        - echo "example"

pipelines:
  default:
    - step: *some-step
  custom:
    example:
      - step:
          <<: *some-step
          condition: null

How to run the same Bitbucket Pipeline with different environment variables for different branches?

I have a monorepo project that is deployed to 3 environments - testing, staging and production. Deploys to testing come from the next branch, while staging and production from the master branch. Testing deploys should run automatically on every commit to next (but I'm also fine with having to trigger them manually), but deploys from the master branch should be triggered manually. In addition, every deploy may consist of a client push and server push (depending on the files changed). The commands to deploy to each of the hosts are exactly the same, the only thing changing is the host itself and the environment variables.
Therefore I have 2 questions:
Can I make Bitbucket prompt me for the deployment target when I manually trigger the pipeline, thus basically letting me choose the set of env variables to inject into the fixed sequence of commands? I've seen a screenshot of this in a tutorial, but I lost it and haven't been able to find it since.
Can I have parallel sequences of commands? I'd like the server and the client push to run simultaneously, but each has different steps. Or do I need to merge those into the same step with multiple scripts to achieve that?
Thank you for your help.
The answer to both of your questions is 'Yes'.
The feature that makes it possible is called custom pipelines. Here is a neat doc that demonstrates how to use them.
There is a parallel keyword which you can use to define parallel steps. Check out this doc for details.
If I'm not misinterpreting the description of your setup, your final pipeline should look very similar to this:
pipelines:
  custom:
    deploy-to-staging-or-prod: # As you say the steps are the same; only variable values define the destination.
      - variables: # List variable names here, and Bitbucket will prompt you to supply their values.
          - name: VAR1
          - name: VAR2
      - parallel:
          - step:
              script:
                - ./deploy-client.sh
          - step:
              script:
                - ./deploy-server.sh
  branches:
    next:
      - step:
          script:
            - ./deploy-to-testing.sh
UPD
If you need to use Deployments instead of providing each variable separately, you can utilise the manual trigger type:
definitions:
  steps:
    - step: &RunTests
        script:
          - ./run-tests.sh
    - step: &DeployFromMaster
        script:
          - ./deploy-from-master.sh

pipelines:
  branches:
    next:
      - step:
          script:
            - ./deploy-to-testing.sh
    master:
      - step: *RunTests
      - parallel:
          - step:
              <<: *DeployFromMaster
              deployment: staging
              trigger: manual
          - step:
              <<: *DeployFromMaster
              deployment: production
              trigger: manual
Key docs for understanding this pipeline are still this one, plus this one for YAML anchors. Keep in mind that I introduced the 'RunTests' step on purpose: since a pipeline is triggered by a commit, you can't make the first step manual.
It acts as a stopper before the deploy steps, which can only be manual due to your requirements.

Exporting environment variables from one stage to the next in GitLab CI

Is there a way to export environment variables from one stage to the next in GitLab CI? I'm looking for something similar to the job artifacts feature, only for environment variables instead of files.
Let's say I'm configuring the build in a configure stage and want to store the results as (secret, protected) environment variables for the next stages to use. I could save the configuration in files and store them as job artifacts, but I'm concerned about secrets being made available in files that can be downloaded by everyone.
Since GitLab 13 you can inherit environment variables like this:
build:
  stage: build
  script:
    - echo "BUILD_VERSION=hello" >> build.env
  artifacts:
    reports:
      dotenv: build.env

deploy:
  stage: deploy
  script:
    - echo $BUILD_VERSION # => hello
  dependencies:
    - build
Note: for GitLab < 13.1 you should enable this first in the GitLab Rails console:
Feature.enable(:ci_dependency_variables)
Although not exactly what you wanted, since it uses artifacts:reports:dotenv artifacts, GitLab recommends the following in their guide 'Pass an environment variable to another job':
build:
  stage: build
  script:
    - echo "BUILD_VERSION=hello" >> build.env
  artifacts:
    reports:
      dotenv: build.env

deploy:
  stage: deploy
  script:
    - echo "$BUILD_VERSION" # Output is: 'hello'
  needs:
    - job: build
      artifacts: true
I believe using the needs keyword is preferable over the dependencies keyword (as used in hd-deman's top answer) since:
When a job uses needs, it no longer downloads all artifacts from previous stages by default, because jobs with needs can start before earlier stages complete. With needs you can only download artifacts from the jobs listed in the needs: configuration.
Furthermore, you could minimise the risk by setting the build's artifacts:expire_in time to be very small.
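Combining the two suggestions, a hedged sketch (the 30-minute expiry is an arbitrary example value):

build:
  stage: build
  script:
    - echo "BUILD_VERSION=hello" >> build.env
  artifacts:
    reports:
      dotenv: build.env
    expire_in: 30 minutes  # Keep the dotenv artifact around only briefly.

deploy:
  stage: deploy
  script:
    - echo "$BUILD_VERSION"
  needs:
    - job: build
      artifacts: true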
No, this feature is not available yet, but there is already an issue for this topic.
My suggestion would be to save the variables in a file and cache it, as cached files are not downloadable and are removed when the job finishes.
If you want to be 100% sure, you can delete it manually. See the clean_up stage, e.g.:
cache:
  paths:
    - save_file

stages:
  - job_name_1
  - job_name_2
  - clean_up

job_name_1:
  stage: job_name_1
  script:
    - (your_task) >> save_file

job_name_2:
  stage: job_name_2
  script:
    - cat save_file | do_something_with_content

clean_up:
  stage: clean_up
  script:
    - rm save_file
  when: always
You want to use artifacts for this.
stages:
  - job_name_1
  - job_name_2

job_name_1:
  stage: job_name_1
  script:
    - (your_task) >> save_file
  artifacts:
    paths:
      - save_file
    # Hint: You can set an expiration for them too.

job_name_2:
  stage: job_name_2
  needs:
    - job: job_name_1
      artifacts: true
  script:
    - cat save_file | do_something_with_content

Travis CI: branch filters in build matrix items

We are wondering whether there is any way to add filters to Travis matrix items. In our particular case, we wish to run certain jobs only on specific branches.
The following example would be an ideal way for configuring this scenario, however it doesn't seem to work:
matrix:
  include:
    - env: BUILD_TYPE=release
      branches:
        only:
          - master
    - env: BUILD_TYPE=ci
      branches:
        only:
          - develop
As a workaround, we can exit the build script immediately by checking the appropriate env var (TRAVIS_BRANCH), but that is far from ideal, as launching the build machine and cloning the repo already takes a considerable amount of time.
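For reference, the early-exit workaround described above looks roughly like this (the build.sh script and the guard condition are illustrative):

matrix:
  include:
    - env: BUILD_TYPE=release
    - env: BUILD_TYPE=ci

script:
  # Skip release builds on anything but master; the VM still boots and clones first.
  - |
    if [ "$BUILD_TYPE" = "release" ] && [ "$TRAVIS_BRANCH" != "master" ]; then
      echo "Skipping release build on branch $TRAVIS_BRANCH"
      exit 0
    fi
  - ./build.sh "$BUILD_TYPE"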
You can now achieve this with the beta feature Conditional Build Stages:
jobs:
  include:
    - stage: release
      if: branch = master
      env: BUILD_TYPE=release
    - stage: ci
      if: branch = develop
      env: BUILD_TYPE=ci
