Error integrating Bitbucket Pipelines with SonarCloud

ALM used: Bitbucket Cloud
CI system used: Bitbucket Cloud
Languages of the repository: Angular (Other (for JS, TS, Go, Python, PHP, …))
Error observed
ERROR: Error during SonarScanner execution
ERROR: Not authorized. Please check the property sonar.login or SONAR_TOKEN env variable
Steps to reproduce
SONAR_TOKEN already generated and added to my ENV_VAR
bitbucket-pipelines.yml:
image: 'node:12.22'

clone:
  depth: full  # SonarCloud scanner needs the full history to assign issues properly

definitions:
  caches:
    sonar: ~/.sonar/cache  # Caching SonarCloud artifacts will speed up your build
  steps:
    - step: &build-test-sonarcloud
        name: Build, test and analyze on SonarCloud
        caches:
          - sonar
        script:
          - pipe: sonarsource/sonarcloud-scan:1.2.1
            variables:
              EXTRA_ARGS: '-Dsonar.host.url=https://sonarcloud.io -Dsonar.login=${SONAR_TOKEN}'
    - step: &check-quality-gate-sonarcloud
        name: Check the Quality Gate on SonarCloud
        script:
          - pipe: sonarsource/sonarcloud-quality-gate:0.1.4

pipelines:
  branches:
Potential workaround
No idea.
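One thing worth checking before touching the pipeline is whether the token itself is valid. A minimal sketch, assuming SonarCloud at sonarcloud.io; the check_sonar_token helper is my own, not part of any SonarSource tooling:

```shell
# Hedged sketch: verify that SONAR_TOKEN is present and accepted by SonarCloud.
# /api/authentication/validate returns {"valid":true} for a working token.
check_sonar_token() {
  if [ -z "${SONAR_TOKEN:-}" ]; then
    echo "SONAR_TOKEN is not set"
    return 1
  fi
  # The token is passed as the basic-auth user name, with an empty password.
  curl -fsS -u "${SONAR_TOKEN}:" "https://sonarcloud.io/api/authentication/validate"
}

check_sonar_token || true
```

If this prints {"valid":false}, the variable is defined but holds a wrong or revoked token; regenerate it in SonarCloud and update the repository variable.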

If you have already installed the SonarCloud app in your workspace, there is no need to pass the Sonar URL again; the integration handles the URL part. Also, you should add your Sonar token variable to the workspace or repository environment. After that, log in to your SonarCloud organization account and bind your repository to SonarCloud so that it can be analyzed. Here is my SonarCloud setup.
bitbucket-pipelines.yml file:
image:
  name: <base image>

clone:
  # SonarCloud scanner needs the full history to assign issues properly
  depth: full

definitions:
  caches:
    # Caching SonarCloud artifacts will speed up your build
    sonar: ~/.sonar/cache

pipelines:
  pull-requests:
    '**':
      - step:
          name: "Code Quality and Security on PR"
          script:
            - pipe: sonarsource/sonarcloud-scan:1.2.1
              variables:
                SONAR_TOKEN: '$SONAR_CLOUD_TOKEN'
                SONAR_SCANNER_OPTS: -Xmx512m
                DEBUG: "true"
  branches:
    master:
      - step:
          name: "Code Quality and Security on master"
          script:
            - pipe: sonarsource/sonarcloud-scan:1.2.1
              variables:
                SONAR_TOKEN: '$SONAR_CLOUD_TOKEN'
                SONAR_SCANNER_OPTS: -Xmx512m
                DEBUG: "true"
  tags:
    '*.*.*-beta*':
      - step:
          name: "Image Build & Push"
          services:
            - docker
          caches:
            - docker
          clone:
            depth: 1
          script:
            - <build script>
      - step:
          name: "Deploy"
          deployment: beta
          clone:
            enabled: false
          script:
            - <deploy script>
    '*.*.*-prod':
      - step:
          name: "Image Build & Push"
          services:
            - docker
          caches:
            - docker
          clone:
            depth: 1
          script:
            - <build script>
      - step:
          name: "Deploy"
          deployment: prod
          clone:
            enabled: false
          script:
            - <deploy script>
sonar-project.properties file:
sonar.organization=<sonar cloud organization name>
sonar.projectKey=<project key>
sonar.projectName=<project name>
sonar.sources=<sonar evaluation path>
sonar.language=<repo language>
sonar.sourceEncoding=UTF-8

Related

How to use the same environment variable in multiple steps in Bitbucket Pipelines

The script below gives an error:
The deployment environment 'staging' in your bitbucket-pipelines.yml file occurs multiple times in the pipeline. Please refer to our documentation for valid environments and their ordering.
image: python:3.8

options:
  docker: true

pipelines:
  branches:
    master:
      - step:
          deployment: staging
          name: Setup stage
          script:
            - echo ${db_name}
      - step:
          deployment: staging
          name: Setup cli prod
          script:
            - echo ${db_name}
      - step:
          deployment: staging
          name: Setup cli sandbox
          script:
            - echo ${db_name}
I want to use the same environment (staging) in all steps of my pipeline. Please guide me on how to do this.
This can't be done, because steps marked as deployments must be unique.
Deployment variables are those needed to deploy your code to a particular deployment stage. If you are setting up administrative tasks as pipelines, those are NOT deployments.
Or, if those are actual deployments, abide by the error message you are getting and make sure each declared deployment stage is unique:
image: python:3.8

options:
  docker: true

definitions:
  yaml-anchors:
    - &deploy-step
      script:
        - echo ${db_name}

pipelines:
  branches:
    master:
      - step:
          <<: *deploy-step
          deployment: staging
          name: Deploy staging
      - step:
          <<: *deploy-step
          deployment: production
          name: Deploy prod
      - step:
          <<: *deploy-step
          deployment: sandbox
          name: Deploy sandbox

How to set Bitbucket Pipelines to be manually triggered?

I wrote a pipeline in the Bitbucket environment, but I would like the pipeline to be triggered only when a user runs it, not automatically on push or commit.
Here is the code:
pipelines:
  branches:
    new_ui_apk:
      - step:
          name: Build apk
          size: 2x
          script:
            - JAVA_OPTS="-Xmx2048m -XX:MaxPermSize=2048m -XX:+HeapDumpOnOutOfMemoryError -Dfile.encoding=UTF-8"
            - docker build -t app-release:1.0.0 .
          services:
            - docker

definitions:
  services:
    docker:
      memory: 7128
Currently I use the [skip ci] trick to avoid it, but if another team member pushes or commits any change, the pipeline will run. How else can I avoid this?
If you put the definition under the "custom" property, it stops listening to branches and only runs when a user triggers it.
Use this:
pipelines:
  custom:
    new_ui_apk:
      - step:
          name: Build apk
          size: 2x
          script:
            - JAVA_OPTS="-Xmx2048m -XX:MaxPermSize=2048m -XX:+HeapDumpOnOutOfMemoryError -Dfile.encoding=UTF-8"
            - docker build -t app-release:1.0.0 .
          services:
            - docker

definitions:
  services:
    docker:
      memory: 7128
The answer above is more than you need; you only need to add trigger: manual:

- step:
    image: XXX
    name: XXXX
    deployment: XXXX
    trigger: manual
    script:
      - whatever....
An option to run the step will then be shown in the pipeline view.
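For context, here is a sketch of trigger: manual in a full pipeline (the step names and scripts are placeholders of my own). Note that Bitbucket does not allow the first step of a pipeline to be manual, so the manual trigger typically goes on a later step:

```yaml
pipelines:
  branches:
    new_ui_apk:
      - step:
          name: Build apk          # the first step must run automatically
          script:
            - docker build -t app-release:1.0.0 .
          services:
            - docker
      - step:
          name: Publish apk        # hypothetical follow-up step
          trigger: manual          # waits for a user to press Run in the UI
          script:
            - echo "publish here"
```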

How to configure GitLab Runner to connect to Artifactory?

I am trying to set up a GitLab Runner to connect to Artifactory and pull images. My YAML file to set up the Runner looks like this:
gitlabUrl: https://gitlab.bayer.com/
runnerRegistrationToken: r*******-
rbac:
  create: false
  serviceAccountName: iz-sai-s
  serviceAccount.name: iz-sai-s
runners:
  privileged: true
resources:
  limits:
    memory: 32000Mi
    cpu: 4000m
  requests:
    memory: 32000Mi
    cpu: 2000m
What changes are needed to configure my runner properly so that it connects to the Artifactory URL and pulls images from there?
This is an example where my runner runs as a Docker container with an image that has the JFrog CLI configured in it, so in your case your runner should have the JFrog CLI configured as well. Next, it needs an API key to access Artifactory, which you generate in Artifactory and store in GitLab as a variable (the exact path is your repo - Settings - CI/CD - Variables).
First it authenticates, then it uploads:
publish_job:
  stage: publish_artifact
  image: xxxxxplan/jfrog-cli
  variables:
    ARTIFACTORY_BASE_URL: https://xxxxx.com/artifactory
    REPO_NAME: my-rep
    ARTIFACT_NAME: my-artifact
  script:
    - jfrog rt c --url="$ARTIFACTORY_BASE_URL"/ --apikey="$ARTIFACTORY_KEY"
    - jfrog rt u "target/demo-0.0.1-SNAPSHOT.jar" "$REPO_NAME"/"$ARTIFACT_NAME"_"$CI_PIPELINE_ID.jar" --recursive=false
Mark the answer as accepted if it fulfils your requirement. Also, make sure to use indentation in your question; it is missing.
Edit 1: Adding the whole .gitlab-ci.yml:
stages:
  - build_unittest
  - static_code_review
  - publish_artifact

image: maven:3.6.1-jdk-8-alpine

cache:
  paths:
    - .m2/repository
    - target/

variables:
  MAVEN_OPTS: "-Dmaven.repo.local=.m2/repository"

build_unittest_job:
  stage: build_unittest
  script: 'mvn clean install'
  tags:
    - my-docker
  artifacts:
    paths:
      - target/*.jar
    expire_in: 20 minutes
  when: manual

code_review_job:
  stage: static_code_review
  variables:
    SONARQUBE_BASE_URL: https://xxxxxx.com
  script:
    - mvn sonar:sonar -Dsonar.projectKey=xxxxxx -Dsonar.host.url=https://xxxxx -Dsonar.login=xxxxx
  tags:
    - my-docker
  cache:
    paths:
      - /root/.sonar/cache
      - target/
      - .m2/repository
  when: manual

publish_job:
  stage: publish_artifact
  image: plan/jfrog-cli
  variables:
    ARTIFACTORY_BASE_URL: https://xxxx/artifactory
    REPO_NAME: maven
    ARTIFACT_NAME: myart
  script:
    - jfrog rt c --url="$ARTIFACTORY_BASE_URL"/ --apikey="$ARTIFACTORY_KEY"
    - jfrog rt u "target/demo-SNAPSHOT.jar" "$REPO_NAME"/"$ARTIFACT_NAME"_"$CI_PIPELINE_ID.jar" --recursive=false
  tags:
    - my-docker
  when: manual

Bitbucket Pipeline fails saying that step is empty, null or missing

I'm trying to configure a Bitbucket pipeline to execute the SonarQube pipe, but Bitbucket complains that the pipeline step is empty, null or missing.
I've got SONAR_TOKEN defined as a project variable with the correct token.
Here's my current bitbucket-pipelines.yml file:
image: atlassian/default-image:2

clone:
  depth: full

definitions:
  caches:
    sonar: ~/.sonar/cache
  steps:
    - step: &sonarcloud
      name: Analyze on SonarCloud
      caches:
        - sonar
      script:
        - pipe: sonarsource/sonarcloud-scan:0.1.5
          variables:
            SONAR_TOKEN: ${SONAR_TOKEN}

pipelines:
  branches:
    '*':
      - step: *sonarcloud
Any ideas?
Found the issue.
The problem is that the step details in the definitions area are incorrectly indented and are missing one extra indentation level.
Instead of:
...
- step: &sonarcloud
  name: ...
...
it should be:
...
- step: &sonarcloud
    name: ...  # notice the extra level of indentation
...
The correct YAML is:
image: atlassian/default-image:2

clone:
  depth: full

definitions:
  caches:
    sonar: ~/.sonar/cache
  steps:
    - step: &sonarcloud
        name: Analyze on SonarCloud
        caches:
          - sonar
        script:
          - pipe: sonarsource/sonarcloud-scan:0.1.5
            variables:
              SONAR_TOKEN: ${SONAR_TOKEN}

pipelines:
  branches:
    '*':
      - step: *sonarcloud

Bitbucket Pipelines share SOME steps between branches

Is it possible to share steps between branches and still run branch-specific steps? For example, the develop and release branches have the same build process, but upload to separate S3 buckets.
pipelines:
  default:
    - step:
        script:
          - cd source
          - npm install
          - npm build
  branches:
    develop:
      - step:
          script:
            - s3cmd put --config s3cmd.cfg ./build s3://develop
    staging:
      - step:
          script:
            - s3cmd put --config s3cmd.cfg ./build s3://staging
I saw this post (Bitbucket Pipelines - multiple branches with same steps) but it's for the same steps.
Use YAML anchors:
definitions:
  steps:
    - step: &Test-step
        name: Run tests
        script:
          - npm install
          - npm run test
    - step: &Deploy-step
        name: Deploy to staging
        deployment: staging
        script:
          - npm install
          - npm run build
          - fab deploy

pipelines:
  default:
    - step: *Test-step
    - step: *Deploy-step
  branches:
    master:
      - step: *Test-step
      - step:
          <<: *Deploy-step
          name: Deploy to production
          deployment: production
          trigger: manual
Docs: https://confluence.atlassian.com/bitbucket/yaml-anchors-960154027.html
Although it's not officially supported yet, you can pre-define steps now using YAML anchors. I got this tip from Bitbucket staff when I had an issue running the same steps across a subset of branches.
definitions:
  step: &Build
    name: Build
    script:
      - npm install
      - npm build

pipelines:
  default:
    - step: *Build
  branches:
    master:
      - step: *Build
      - step:
          name: deploy
          # do some deploy from master only
I think Bitbucket can't do it. You can use one pipeline and check the branch name:
pipelines:
  default:
    - step:
        script:
          - cd source
          - npm install
          - npm build
          - if [[ $BITBUCKET_BRANCH = develop ]]; then s3cmd put --config s3cmd.cfg ./build s3://develop; fi
          - if [[ $BITBUCKET_BRANCH = staging ]]; then s3cmd put --config s3cmd.cfg ./build s3://staging; fi
The last two lines will be executed only on the specified branches.
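The same branch check can be factored into a small helper. A sketch (the pick_bucket function and the bucket names are my own invention; BITBUCKET_BRANCH is the variable Pipelines provides):

```shell
# Map the current branch to its S3 target; empty output means "skip the upload".
pick_bucket() {
  case "$1" in
    develop) echo "s3://develop" ;;
    staging) echo "s3://staging" ;;
    *)       echo "" ;;
  esac
}

bucket=$(pick_bucket "${BITBUCKET_BRANCH:-}")
if [ -n "$bucket" ]; then
  # In a real step this would run the upload instead of printing it.
  echo "s3cmd put --config s3cmd.cfg ./build $bucket"
fi
```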
You can define and re-use steps with YAML anchors:
an anchor (&) defines a chunk of configuration
an alias (*) refers to that chunk elsewhere
The source branch is saved in a default variable called BITBUCKET_BRANCH.
You'd also need to pass the build results (in this case the build/ folder) from one step to the next, which is done with artifacts.
Combining all three will give you the following config:
definitions:
  steps:
    - step: &build
        name: Build
        script:
          - cd source
          - npm install
          - npm build
        artifacts: # defining the artifacts to be passed to each future step
          - ./build
    - step: &s3-transfer
        name: Transfer to S3
        script:
          - s3cmd put --config s3cmd.cfg ./build s3://${BITBUCKET_BRANCH}

pipelines:
  default:
    - step: *build
  branches:
    develop:
      - step: *build
      - step: *s3-transfer
    staging:
      - step: *build
      - step: *s3-transfer
You can now also use glob patterns, as mentioned in the referenced post, to cover both the develop and staging branches in one go:

"{develop,staging}":
  - step: *build
  - step: *s3-transfer
Apparently it's in the works. Hopefully available soon.
https://bitbucket.org/site/master/issues/12750/allow-multiple-steps
