Similar to the Jenkins Groovy file, is there any equivalent file for Bamboo?

I'm totally new to the DevOps field. For Jenkins, a Groovy file is used to define the Preparation-Build-Deploy flow; what script is used for the same purpose in Bamboo?
I found out that a Bamboo plan is used, but how is the plan generated - through any script or file?
I have a pipeline for Jenkins; how can the same be done for a Bamboo plan?
The Groovy file for Jenkins is:
node {
    stage('Preparation') { // for display purposes
        // Get EDM code from a GitHub repository
        cleanWs()
        checkout scm
        sh "python $WORKSPACE/common/deployment_scripts/abc.py --localFolder $WORKSPACE --env dev"
    }
    stage('Build') {
        // Run the maven build
        sh "mvn clean install -f $WORKSPACE/pom.xml -Dmaven.test.skip=true"
    }
    stage('Deploy') {
        // Run the deployment script
        sh "python $WORKSPACE/common/deployment_scripts/ase.py $WORKSPACE lm-edm-builds-ndev ${env.BUILD_NUMBER} dev"
        sh "python $WORKSPACE/common/deployment_scripts/qwert.py --JsonParameterFile $WORKSPACE/common/deployment_scripts/my_properties.json --BuildVersion ${env.BUILD_NUMBER} --WorkSpace $WORKSPACE --environment dev"
    }
}

For Bamboo, you can do so with Bamboo Specs. Bamboo Specs let you define Bamboo configuration as code and have the corresponding plans/deployments created or updated automatically in Bamboo. Read more about Bamboo Specs here.
Bamboo Specs support two ways of creating plans: Java or YAML. Select the one that matches your needs best. The syntax for both can be found in the official reference documentation.
A sample YAML Spec that defines a plan can look like the one below, as detailed in this page:
---
version: 2
plan:
  project-key: MARS
  key: ROCKET
  name: Build the rockets

# List of plan's stages and jobs
stages:
  - Build the rocket stage:
      - Build

# Job definition
Build:
  tasks:
    - script:
        - mkdir -p falcon/red
        - echo wings > falcon/red/wings
        - sleep 1
        - echo 'Built it'
    - test-parser:
        type: junit
        test-results: '**/junit/*.xml'
  # Job's requirements
  requirements:
    - isRocketFuel
  # Job's artifacts. Artifacts are shared by default.
  artifacts:
    - name: Red rocket built
      pattern: falcon/red/wings
You may start with this tutorial: Creating a simple plan with Bamboo Java Specs.
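If you prefer the Java flavour, a plan roughly equivalent to the YAML above can be sketched as follows. This is only a hedged sketch modelled on the Atlassian tutorial; the class names and constructor signatures should be verified against the Bamboo Specs reference for your Bamboo version, and the server URL is a placeholder.

import com.atlassian.bamboo.specs.api.BambooSpec;
import com.atlassian.bamboo.specs.api.builders.plan.Job;
import com.atlassian.bamboo.specs.api.builders.plan.Plan;
import com.atlassian.bamboo.specs.api.builders.plan.Stage;
import com.atlassian.bamboo.specs.api.builders.project.Project;
import com.atlassian.bamboo.specs.builders.task.ScriptTask;
import com.atlassian.bamboo.specs.util.BambooServer;

@BambooSpec
public class PlanSpec {

    // Same plan as the YAML example above: project MARS, plan ROCKET, one stage with one scripted job
    Plan plan() {
        return new Plan(new Project().key("MARS").name("Mars"),
                "Build the rockets", "ROCKET")
                .stages(new Stage("Build the rocket stage")
                        .jobs(new Job("Build", "BUILD")
                                .tasks(new ScriptTask()
                                        .inlineBody("mkdir -p falcon/red\necho wings > falcon/red/wings"))));
    }

    public static void main(String[] args) {
        // Credentials are read from a .credentials file by default; the URL is a placeholder
        BambooServer bambooServer = new BambooServer("http://localhost:8085");
        bambooServer.publish(new PlanSpec().plan());
    }
}

Running the main method publishes (creates or updates) the plan on the Bamboo server.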


How to use a Jenkinsfile for these build steps?

I'm learning how to use Jenkins and am working on configuring a Jenkinsfile instead of setting up the build through the Jenkins UI.
The source code management step for building from Bitbucket:
The build step for building a Docker container:
The build is of type multi-configuration project:
Reading the Jenkinsfile documentation at https://www.jenkins.io/doc/book/pipeline/jenkinsfile/index.html and creating a new build using Pipeline:
I'm unsure how to recreate the steps I've configured via the UI: Source Code Management & Build. How do I convert the Docker and Bitbucket configuration so that it can be used with a Jenkinsfile?
The SCM configuration will not change, regardless of whether you are using the UI configuration or a pipeline, although in theory you can do the git clone from steps within the pipeline if you really insist on converting the SCM setup into pure pipeline steps.
The pipeline can have multiple stages, and each of the stages can have a different execution environment. You can use the Docker Pipeline plugin, or you can use plain sh to issue the docker commands on the build agent.
Here is a small sample from one of my manual build pipelines:
pipeline {
    agent none
    stages {
        stage('Init') {
            agent { label 'docker-x86' }
            steps {
                checkout scm
                sh 'docker stop demo-001c || true'
                sh 'docker rm demo-001c || true'
            }
        }
        stage('Build Back-end') {
            agent { label 'docker-x86' }
            steps {
                sh 'docker build -t demo-001:latest ./docker'
            }
        }
        stage('Test') {
            agent {
                docker {
                    label 'docker-x86'
                    // note: a docker agent normally also requires an 'image' to be specified here
                }
            }
            steps {
                sh 'docker run --name demo-001c demo-001:latest'
                sh 'cd test && make test-back-end'
            }
        }
    }
}
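For comparison, the same flow can also be expressed with the Docker Pipeline plugin mentioned above instead of plain sh. This is only an illustrative sketch: it assumes the plugin is installed and reuses the demo-001 image name and ./docker directory from the sample above.

node('docker-x86') {
    stage('Checkout') {
        checkout scm                                 // same SCM configured for the job
    }
    stage('Build Back-end') {
        docker.build('demo-001:latest', './docker')  // build the image from the ./docker directory
    }
    stage('Test') {
        docker.image('demo-001:latest').inside {     // run the test steps inside the freshly built image
            sh 'cd test && make test-back-end'
        }
    }
}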
You need to create a Pipeline type of project and specify the SCM configuration in the General tab. In the Pipeline tab, you will have the option to select Pipeline script or Pipeline script from SCM. It's always better to start with Pipeline script while you are building and modifying your workflow. Once it's stabilized, you can add it to the repository.

Triggering a Jenkins job from a GitLab pipeline stage and moving to the next stage only on successful completion of the job

Can you please help? I have the following scenario and I went through many videos and blogs but could not find anything matching my use case.
Requirement:
To write a CI/CD pipeline in GitLab which can facilitate the following stages, in this order:
- verify # unit test, sonarqube, pages
- build # package
- publish # copy artifact in repository
- deploy # Deploy artifact on runtime in an test environment
- integration # run postman\integration tests
All other stages are fine and working, but for the deploy stage, because of a few restrictions, I have to trigger an existing Jenkins job using the Jenkins remote API with the following script. The problem is that the call returns an asynchronous response and just starts the Jenkins job, so the deploy stage completes immediately and the pipeline moves on to the next stage (integration).
Run Jenkins Job:
  image: maven:3-jdk-8
  tags:
    - java
  environment: development
  stage: deploy
  script:
    - artifact_no=$(grep -m1 '<version>' pom.xml | grep -oP '(?<=>).*(?=<)')
    - curl -X POST http://myhost:8081/job/fpp/view/categorized/job/fpp_PREP_party/build --user mkumar:1121053c6b6d19bf0b3c1d6ab604f22867 --data-urlencode json="{\"parameter\":[{\"name\":\"app_version\",\"value\":\"$artifact_no\"}]}"
Note: Using GitLab CE edition and Jenkins CI project service is not available.
I am looking for a possible way of triggering the Jenkins job from the pipeline so that my integration stage starts executing only on successful completion of the Jenkins job.
Thanks for the help!
Retrieving the status of a Jenkins job that is triggered programmatically through the remote access API is notoriously convoluted.
Normally you would expect to receive, in the response header under the Location attribute, a URL that you can poll to get the status of your request, but unfortunately there are some in-between steps to reach that point. You can find a guide in this post. You may also have a look at this older post.
Once you have the URL, you can poll and parse the job status and either exit 1 or exit 0 in your script to force the job that is invoking the external job to fail or succeed, depending on how you want to assert the result of the remote job.
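For illustration only (this is not part of the original answer): the POST to /build returns a Location header pointing at a queue item; polling that item's api/json eventually exposes the URL of the started build, and the build's own api/json exposes a result field once it finishes. A rough sketch of such a polling script for the deploy job, assuming jq is available in the job image, that $artifact_no is set as in the original job, and using placeholder CI variables for the credentials:

Run Jenkins Job:
  stage: deploy
  script:
    - |
      # Trigger the Jenkins job and capture the queue-item URL from the Location header
      QUEUE_URL=$(curl -s -i -X POST "http://myhost:8081/job/fpp/view/categorized/job/fpp_PREP_party/build" \
          --user "$JENKINS_USER:$JENKINS_TOKEN" \
          --data-urlencode json="{\"parameter\":[{\"name\":\"app_version\",\"value\":\"$artifact_no\"}]}" \
        | grep -i '^Location:' | tr -d '\r' | awk '{print $2}')
      # Wait until the queued item has been assigned an actual build
      BUILD_URL=""
      while [ -z "$BUILD_URL" ] || [ "$BUILD_URL" = "null" ]; do
        sleep 10
        BUILD_URL=$(curl -s --user "$JENKINS_USER:$JENKINS_TOKEN" "${QUEUE_URL}api/json" | jq -r '.executable.url')
      done
      # Poll the build until it reports a result, then pass or fail this GitLab job accordingly
      RESULT="null"
      while [ "$RESULT" = "null" ]; do
        sleep 30
        RESULT=$(curl -s --user "$JENKINS_USER:$JENKINS_TOKEN" "${BUILD_URL}api/json" | jq -r '.result')
      done
      echo "Jenkins build finished with result: $RESULT"
      test "$RESULT" = "SUCCESS"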

Jenkins Using result artifacts in different steps - stash and unstash

I have a Jenkinsfile declarative pipeline which has two steps:
build an RPM file inside a Docker container
build a Docker image with the RPM and run it
The first step is built inside a Docker container because it requires a specific app to build the RPM.
The second step runs directly on a Jenkins slave, which can be a different slave than the one that ran the first step.
In order to use the RPM produced by the first step, I'm currently using the stash and unstash steps. If I do not use them, the second step doesn't have access to the RPM file.
The RPM file is about 215 MB, which is more than the recommended 100 MB limit, so I'd like to know if there is a better solution.
pipeline {
    agent any
    options {
        timestamps()
    }
    stages {
        stage('Gradle: build') {
            agent {
                docker {
                    image 'some-internal-image'
                }
            }
            steps {
                sh """
                chmod +x gradlew
                ./gradlew buildRpm
                """
            }
            post {
                success {
                    stash name: 'rpm', includes: 'Server/target/myapp.rpm'
                }
            }
        }
        stage('Gradle: build docker image') {
            steps {
                unstash 'rpm'
                sh """
                chmod +x gradlew
                ./gradlew buildDockerImage
                """
            }
        }
    }
}
You could use Docker's multi-stage build, but I'm not aware of a nice implementation using Jenkins Pipelines.
We're also stashing several hundreds of megabytes to distribute it to build agents. I've experimented with uploading the artifacts to S3 and downloading them again from there, with no visible performance improvement (only that it takes load off the Jenkins master).
So my very opinionated recommendation: keep it like it is and optimize once you really run into performance / load issues.
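If you do want to try the S3 route mentioned above, a hedged sketch using the "Pipeline: AWS Steps" plugin could look like the following; the region, bucket name and credentials ID are placeholders, and the RPM path is taken from the question.

stage('Gradle: build') {
    steps {
        sh 'chmod +x gradlew && ./gradlew buildRpm'
        // Upload the RPM to S3 instead of stashing it
        withAWS(region: 'eu-west-1', credentials: 'aws-artifacts') {
            s3Upload(file: 'Server/target/myapp.rpm',
                     bucket: 'my-build-artifacts',
                     path: "rpms/${env.BUILD_NUMBER}/myapp.rpm")
        }
    }
}
stage('Gradle: build docker image') {
    steps {
        // Fetch the RPM back on whichever slave runs this stage
        withAWS(region: 'eu-west-1', credentials: 'aws-artifacts') {
            s3Download(file: 'Server/target/myapp.rpm',
                       bucket: 'my-build-artifacts',
                       path: "rpms/${env.BUILD_NUMBER}/myapp.rpm",
                       force: true)
        }
        sh 'chmod +x gradlew && ./gradlew buildDockerImage'
    }
}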
You can use Artifactory or any other binary repository manager.
From Artifactory's webpage:
As the first, and only, universal Artifact Repository Manager on the market, JFrog Artifactory fully supports software packages created by any language or technology.
...
...Artifactory provides an end-to-end, automated and bullet-proof solution for tracking artifacts from development to production.
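As a hedged sketch of that idea using the JFrog Artifactory Jenkins plugin (the server ID and the generic-local repository name are placeholders), the stash/unstash pair from the question could be replaced with an upload and a download against a generic repository:

stage('Gradle: build') {
    agent { docker { image 'some-internal-image' } }
    steps {
        sh 'chmod +x gradlew && ./gradlew buildRpm'
        // Upload the RPM to a generic repository instead of stashing it
        rtUpload(
            serverId: 'my-artifactory',
            spec: """{"files": [{"pattern": "Server/target/myapp.rpm",
                                 "target": "generic-local/myapp/${env.BUILD_NUMBER}/"}]}"""
        )
    }
}
stage('Gradle: build docker image') {
    steps {
        // Fetch the RPM back on whichever slave runs this stage
        rtDownload(
            serverId: 'my-artifactory',
            spec: """{"files": [{"pattern": "generic-local/myapp/${env.BUILD_NUMBER}/myapp.rpm",
                                 "target": "Server/target/",
                                 "flat": "true"}]}"""
        )
        sh 'chmod +x gradlew && ./gradlew buildDockerImage'
    }
}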

Exporting environment variables from one stage to the next in GitLab CI

Is there a way to export environment variables from one stage to the next in GitLab CI? I'm looking for something similar to the job artifacts feature, only for environment variables instead of files.
Let's say I'm configuring the build in a configure stage and want to store the results as (secret, protected) environment variables for the next stages to use. I could save the configuration in files and store them as job artifacts, but I'm concerned about secrets being made available in files that can be downloaded by everyone.
Since GitLab 13 you can inherit environment variables like this:
build:
  stage: build
  script:
    - echo "BUILD_VERSION=hello" >> build.env
  artifacts:
    reports:
      dotenv: build.env

deploy:
  stage: deploy
  script:
    - echo $BUILD_VERSION  # => hello
  dependencies:
    - build
Note: for GitLab < 13.1 you should enable this first in the GitLab Rails console:
Feature.enable(:ci_dependency_variables)
Although not exactly what you wanted since it uses artifacts:reports:dotenv artifacts, GitLab recommends doing the below in their guide: 'Pass an environment variable to another job':
build:
  stage: build
  script:
    - echo "BUILD_VERSION=hello" >> build.env
  artifacts:
    reports:
      dotenv: build.env

deploy:
  stage: deploy
  script:
    - echo "$BUILD_VERSION" # Output is: 'hello'
  needs:
    - job: build
      artifacts: true
I believe using the needs keyword is preferable over the dependencies keyword (as used in hd-deman's top answer) since:
When a job uses needs, it no longer downloads all artifacts from previous stages by default, because jobs with needs can start before earlier stages complete. With needs you can only download artifacts from the jobs listed in the needs: configuration.
Furthermore, you could minimise the risk by setting the build's artifacts:expire_in time to be very small.
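For example, a minimal illustration of such a short expiry (the 10-minute value is arbitrary):

build:
  stage: build
  script:
    - echo "BUILD_VERSION=hello" >> build.env
  artifacts:
    reports:
      dotenv: build.env
    expire_in: 10 minutes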
No, this feature is not available yet, but there is already an issue for this topic.
My suggestion would be to save the variables in a file and cache it, as the cache is not downloadable and will be removed when the job finishes.
If you want to be 100% sure you can delete it manually. See the clean_up stage.
e.g.
cache:
  paths:
    - save_file

stages:
  - job_name_1
  - job_name_2
  - clean_up

job_name_1:
  stage: job_name_1
  script:
    - (your_task) >> save_file

job_name_2:
  stage: job_name_2
  script:
    - cat save_file | do_something_with_content

clean_up:
  stage: clean_up
  script:
    - rm save_file
  when: always
You want to use artifacts for this.
stages:
  - job_name_1
  - job_name_2
  - clean_up

job_name_1:
  stage: job_name_1
  script:
    - (your_task) >> save_file
  artifacts:
    paths:
      - save_file
    # Hint: You can set an expiration for them too.

job_name_2:
  stage: job_name_2
  needs:
    - job: job_name_1
      artifacts: true
  script:
    - cat save_file | do_something_with_content

Auto generate build pipeline for gradle build using Jenkinsfile

I am trying to create a build pipeline based upon Gradle tasks. I have viewed the Jenkinsfile configuration Pipeline-as-code-demo, but I am unable to create a pipeline for Gradle tasks. Please suggest a possible way so that I can use the Jenkinsfile to automatically show the build pipeline just by reading the configuration from the Jenkinsfile.
Thank you.
In case your project uses Gradle Wrapper you can use the following snippet in your Jenkinsfile:
stage('Gradle Build') {
    if (isUnix()) {
        sh './gradlew clean build'
    } else {
        bat 'gradlew.bat clean build'
    }
}
If you check out into a subdirectory sub-dir, you might want to use:
stage('Gradle Build') {
    if (isUnix()) {
        dir('sub-dir') { sh './gradlew clean build' }
    } else {
        dir('sub-dir') { bat 'gradlew.bat clean build' }
    }
}
In case you're using Artifactory to resolve your build dependencies or to deploy your build artifacts, it is recommended to use the Pipeline DSL for Gradle build with Artifactory.
Here's an example taken from the Jenkins Pipeline Examples page:
node {
    // Get Artifactory server instance, defined in the Artifactory Plugin administration page.
    def server = Artifactory.server "SERVER_ID"
    // Create an Artifactory Gradle instance.
    def rtGradle = Artifactory.newGradleBuild()

    stage 'Clone sources'
    git url: 'https://github.com/jfrogdev/project-examples.git'

    stage 'Artifactory configuration'
    // Tool name from Jenkins configuration
    rtGradle.tool = "Gradle-2.4"
    // Set Artifactory repositories for dependencies resolution and artifacts deployment.
    rtGradle.deployer repo: 'ext-release-local', server: server
    rtGradle.resolver repo: 'remote-repos', server: server

    stage 'Gradle build'
    def buildInfo = rtGradle.run rootDir: "gradle-examples/4/gradle-example-ci-server/", buildFile: 'build.gradle', tasks: 'clean artifactoryPublish'

    stage 'Publish build info'
    server.publishBuildInfo buildInfo
}
Otherwise, you can simply run the gradle command with the sh or bat Pipeline steps.
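For completeness, a minimal declarative sketch along those lines (the stage names and Gradle tasks are illustrative only, and the Gradle wrapper is assumed to be present in the repository):

pipeline {
    agent any
    stages {
        stage('Compile') {
            steps { sh './gradlew clean assemble' }   // compile and assemble the artifacts
        }
        stage('Test') {
            steps { sh './gradlew test' }             // run the unit tests
        }
        stage('Package') {
            steps { sh './gradlew build' }            // full build, including checks
        }
    }
}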
In Jenkins you can create a pipeline using a script written in a Jenkinsfile.
We write the script using 'stages' and 'node' as building blocks; these building blocks allow you to specify instructions that should be executed as part of the Jenkins pipeline.
To execute a Gradle build using a Jenkinsfile, first check the operating system and call the appropriate shell that can execute the Gradle task, as below:
Jenkinsfile
stage 'build_Project'
node {
    if (isUnix()) {
        sh 'gradle build --info'
    } else {
        bat 'gradle build --info'
    }
}
The above code snippet creates a stage named build_Project and executes the Gradle build of the current project.
