How to enable Yarn for Jenkinsfile Pipeline syntax?

Trying to set up a simple unit test runner with Jenkins 2 using a Jenkinsfile and Pipeline declarative syntax. The below sample just about works, but I'd like to use yarn instead of npm.
Jenkinsfile
#!groovy
pipeline {
    agent any
    tools { nodejs 'node-8.10.0' } // previously configured via Manage Jenkins -> Global Tool Configuration
    stages {
        stage('Unit') {
            steps {
                checkout scm
                sh 'node -v' // 8.10.0
                sh 'npm -v' // 5.6.0
                sh 'npm install' // <-- desired change: 'yarn install'
                sh 'npm run test:unit' // <-- desired change: 'yarn test:unit'
            }
        }
    }
}
Bonus question: is checkout scm really required? Adding it appears to cause it to run twice.

You can set yarn as an installable dependency in the NodeJS tool configuration:
After defining the NodeJS tool, you can declare which global packages you want to install.
You will find it in the Global Tool Configuration menu in the Manage Jenkins section.
Each time your pipeline is built, the tool will provide a Node.js environment with yarn installed.
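With yarn listed as a global package on the NodeJS tool, the question's Jenkinsfile only needs its sh steps swapped. A minimal sketch, keeping the 'node-8.10.0' tool name from the question:
pipeline {
    agent any
    tools { nodejs 'node-8.10.0' } // NodeJS tool configured with "yarn" under global packages
    stages {
        stage('Unit') {
            steps {
                sh 'yarn install'
                sh 'yarn test:unit'
            }
        }
    }
}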

There is no yarn plugin for Jenkins as far as I know. So there is no yarn tool which can be easily used in a Pipeline and will take care of the yarn installation.
So here are some other possibilities:
You can install yarn locally on Jenkins and use sh 'yarn install' in the pipeline. See https://yarnpkg.com/en/docs/install#alternatives-stable for a list of possible ways to install it. Some of them, like the curl-based installation, can easily be scripted in a pipeline.
Or you can install yarn via the npm provided in the pipeline:
sh "npm install -g yarn"
sh "yarn install"
Or, if you are using Java and Maven, you can use the frontend-maven-plugin to install yarn via Maven (Maven itself can be provided via the tools block in pipelines) and then use the yarn installed by that plugin.
Or build inside a Docker container: the official node image already has yarn installed.
pipeline {
    agent {
        docker { image 'node:8.11' }
    }
    stages {
        stage('Test') {
            steps {
                sh 'yarn install'
            }
        }
    }
}
And as you observed, the checkout scm step is redundant here: a declarative pipeline checks out the code (and the pipeline script) in an implicit checkout step before your own steps run.

Related

How to build a FastAPI Python script on Jenkins so that the FastAPI app runs forever

I have written a FastAPI app and I want to build a CI/CD pipeline for it. I want the FastAPI app to run forever, until I manually interrupt it.
I am very new to Jenkins. I have written a Jenkins pipeline script, which follows.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // Get some code from a GitHub repository
                git url: 'https://gitlab.com/something1234/test.git', branch: 'hello'
            }
        }
        stage('deploy') {
            steps {
                sh """
                    ls
                    #!/bin/bash
                    echo $PATH
                    echo $HOME
                    . /home/soleman/anaconda3/bin/activate tensor2
                    conda activate tensor2
                    python api.py
                """
            }
        }
    }
}
When I start building this script, the build runs forever instead of succeeding, because the Python API keeps running, as shown in the figure.
How can I build and deploy it successfully? Please guide.
I don't want to use Docker, as I rely on the system's own environment variables.
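A common way to handle this kind of long-running deploy step (a sketch, not from this thread; it reuses the activation commands from the question) is to start the API in the background and exempt it from Jenkins' process cleanup, so the build itself can finish:
stage('deploy') {
    steps {
        sh '''
            . /home/soleman/anaconda3/bin/activate tensor2
            conda activate tensor2
            # dontKillMe stops Jenkins' ProcessTreeKiller from reaping the API when the build ends
            JENKINS_NODE_COOKIE=dontKillMe nohup python api.py > api.log 2>&1 &
        '''
    }
}
A more robust alternative is to have the deploy stage install or restart a system service (for example via systemd) instead of launching the process directly from the build.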

What is the equivalent of the tools directive in a Jenkins scripted pipeline?

How does Jenkins manage plugins? Do all nodes have a set of plugins installed according to the list specified in the master?
What is the equivalent of the declarative pipeline's tools directive in a scripted pipeline? If there isn't one, how do we use tools like Maven and NodeJS?
According to the NodeJS plugin documentation at https://plugins.jenkins.io/nodejs/, you can do the following:
nodejs(nodeJSInstallationName: 'Node-name') {
    sh 'npm install'
}
Same for the maven plugin:
withMaven(maven: 'Jenkins Maven') {
    sh 'mvn install'
}
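If you prefer the generic mechanism over plugin-specific steps, the tool step is the closest scripted equivalent of the declarative tools block. A sketch, assuming the installation names 'Node-name' and 'Jenkins Maven' from above exist in Global Tool Configuration:
node {
    def nodeHome = tool 'Node-name'      // resolves the configured NodeJS installation
    def mvnHome  = tool 'Jenkins Maven'  // resolves the configured Maven installation
    withEnv(["PATH+NODE=${nodeHome}/bin", "PATH+MAVEN=${mvnHome}/bin"]) {
        sh 'node -v'
        sh 'mvn -v'
    }
}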

How to use a Jenkinsfile for these build steps?

I'm learning how to use Jenkins and working on configuring a Jenkinsfile instead of setting up the build through the Jenkins UI.
The source code management step for building from Bitbucket:
The build step for building a Docker container:
The build is of type multi-configuration project:
Reading the Jenkinsfile documentation at https://www.jenkins.io/doc/book/pipeline/jenkinsfile/index.html and creating a new build using Pipeline:
I'm unsure how to configure the steps I've configured via the UI: Source Code Management & Build. How do I convert the Docker and Bitbucket configuration so it can be used with a Jenkinsfile?
The SCM configuration does not change, regardless of whether you use the UI or a pipeline, although in theory you can do the git clone from the steps in the pipeline if you really insist on converting the SCM configuration into pure pipeline steps.
A pipeline can have multiple stages, and each stage can have a different execution environment. You can use the Docker Pipeline plugin, or you can use plain sh steps to issue the docker commands on the build agent.
Here is a small sample from one of my manual build pipelines:
pipeline {
    agent none
    stages {
        stage('Init') {
            agent { label 'docker-x86' }
            steps {
                checkout scm
                sh 'docker stop demo-001c || true'
                sh 'docker rm demo-001c || true'
            }
        }
        stage('Build Back-end') {
            agent { label 'docker-x86' }
            steps {
                sh 'docker build -t demo-001:latest ./docker'
            }
        }
        stage('Test') {
            agent {
                docker {
                    label 'docker-x86'
                }
            }
            steps {
                sh 'docker run --name demo-001c demo-001:latest'
                sh 'cd test && make test-back-end'
            }
        }
    }
}
You need to create a Pipeline type of project and specify the SCM configuration in the General tab. In the Pipeline tab, you will have the option to select Pipeline script or Pipeline script from SCM. It's always better to start with Pipeline script while you are building and modifying your workflow. Once it has stabilized, you can add it to the repository.
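For orientation, a rough Jenkinsfile equivalent of a Bitbucket checkout followed by a Docker image build could look like the sketch below; the agent label, repository URL, credentials ID, and image tag are placeholders to replace with your own values:
pipeline {
    agent { label 'docker' } // an agent that has Docker installed
    stages {
        stage('Checkout') {
            steps {
                git url: 'https://bitbucket.org/your-team/your-repo.git',
                    branch: 'master',
                    credentialsId: 'bitbucket-credentials'
            }
        }
        stage('Build image') {
            steps {
                sh 'docker build -t your-image:latest .'
            }
        }
    }
}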

Jenkins: avoid tool installation if it is already installed

Not a Jenkins expert here. I have a scripted pipeline with a tool installed (Node). Unfortunately it was configured to pull in other dependencies, which now takes about 250 seconds overall. I'd like to add a condition to avoid this installation if it (Node with its packages) was already installed previously, but I don't know where to start. Perhaps Jenkins stores metadata from previous runs that can be checked?
node {
    env.NODEJS_HOME = "${tool 'Node v8.11.3'}"
    env.PATH = "${env.NODEJS_HOME}/bin:${env.PATH}"
    env.PATH = "/opt/xs/bin:${env.PATH}"
    // ...
}
Are you using dynamic Jenkins agents (Docker containers)? In that case, tools will be installed every time you run a build.
Mount volumes into the containers, use persistent agents, or build your own Docker image with Node.js preinstalled.
As I can see, you are using a workaround to install the Node.js tool.
Jenkins supports this natively (declarative style):
pipeline {
    agent any
    tools {
        nodejs 'NodeJS_14.5'
    }
    stages {
        stage ('nodejs test') {
            steps {
                sh 'npm -v'
            }
        }
    }
}
On the first run the tool will be installed; on subsequent runs it will not be, since it is already installed.
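For the scripted pipeline in the question, another option is to take the slow extra packages out of the tool configuration and guard their installation in the pipeline itself. A sketch; 'some-global-package' is a placeholder for whatever the tool currently pulls in:
node {
    env.NODEJS_HOME = "${tool 'Node v8.11.3'}" // the tool step itself reuses an existing installation
    env.PATH = "${env.NODEJS_HOME}/bin:${env.PATH}"

    // Only run the slow global install when the package is missing on this agent.
    def alreadyInstalled = sh(returnStatus: true,
        script: 'npm ls -g some-global-package --depth=0') == 0
    if (!alreadyInstalled) {
        sh 'npm install -g some-global-package'
    }
}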

Jenkins Pipeline Across Multiple Docker Images

Using a declarative pipeline in Jenkins, how do I run stages across multiple versions of a Docker image? I want to execute the following Jenkinsfile on Python 2.7, 3.5, and 3.6. Below is a pipeline file for building and testing a Python project in a Docker container.
pipeline {
    agent {
        docker {
            image 'python:2.7.14'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'pip install pipenv'
                sh 'pipenv install --dev'
            }
        }
        stage('Test') {
            steps {
                sh 'pipenv run pytest --junitxml=TestResults.xml'
            }
        }
    }
    post {
        always {
            junit 'TestResults.xml'
        }
    }
}
What is the minimal amount of code to make sure the same steps succeed across Python 3.5 and 3.6? The hope is that if a test fails, it is evident which version(s) it fails on.
Or is what I'm asking for not possible with declarative pipelines (e.g. scripted pipelines may be what would most elegantly solve this problem)?
As a comparison, this is how Travis CI lets you specify runs across different Python versions.
I had to resort to a scripted pipeline and combine all the stages:
def pythons = ["2.7.14", "3.5.4", "3.6.2"]
def steps = pythons.collectEntries {
    ["python $it": job(it)]
}
parallel steps

def job(version) {
    return {
        docker.image("python:${version}").inside {
            checkout scm
            sh 'pip install pipenv'
            sh 'pipenv install --dev'
            sh 'pipenv run pytest --junitxml=TestResults.xml'
            junit 'TestResults.xml'
        }
    }
}
The resulting pipeline looks like this:
Ideally we'd be able to break up each job into stages (Setup, Build, Test), but the UI currently doesn't support this (still not supported).
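On newer Jenkins versions, the declarative matrix directive can express the same fan-out without dropping to a scripted pipeline. A sketch, assuming a Declarative Pipeline plugin recent enough to support matrix (the axis values mirror the scripted example above):
pipeline {
    agent none
    stages {
        stage('Build and Test') {
            matrix {
                axes {
                    axis {
                        name 'PYTHON_VERSION'
                        values '2.7.14', '3.5.4', '3.6.2'
                    }
                }
                agent { docker { image "python:${PYTHON_VERSION}" } } // one container per axis value
                stages {
                    stage('Build') {
                        steps {
                            sh 'pip install pipenv'
                            sh 'pipenv install --dev'
                        }
                    }
                    stage('Test') {
                        steps {
                            sh 'pipenv run pytest --junitxml=TestResults.xml'
                        }
                        post {
                            always {
                                junit 'TestResults.xml'
                            }
                        }
                    }
                }
            }
        }
    }
}
Each cell of the matrix shows up as its own set of stages, so a failure is attributable to a specific Python version.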
