Jenkins call script after build success - docker

I want Jenkins to call a shell script in another Docker container.
I don't know whether Jenkins can do this, or what the best way to do it is.
On one machine, I have these containers:
| Service    | IP (Docker network) | Note |
| ---------- | ------------------- | ---- |
| ERP system | 172.74.42.2         | Odoo 14 ERP system that works with many plugins. If a plugin gets new functions or bug fixes, the system needs to be restarted after the pull, and then the plugin updated. (I wrote a shell script that runs over SSH after commits are pushed.) |
| Jenkins    | 172.74.42.3         | This is where I want to call the shell script after the build succeeds. |
Jenkins itself is working, so the connection between GitHub and Jenkins is fine.
I have tried writing the shell commands in the Jenkinsfile, but I think this is a clumsy way to do it...
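Roughly what I have in mind is a post/success block that calls the script over SSH. A minimal sketch, assuming key-based SSH access from the Jenkins container to the ERP container and the SSH Agent plugin; the credentials ID, user and script path are placeholders:

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'existing build steps go here'
            }
        }
    }
    post {
        success {
            // runs only when the build succeeded; 'erp-ssh-key' is a hypothetical credentials ID
            sshagent(credentials: ['erp-ssh-key']) {
                sh 'ssh -o StrictHostKeyChecking=no odoo@172.74.42.2 /opt/scripts/update_plugins.sh'
            }
        }
    }
}

The post block keeps the restart logic out of the build stages themselves, and success guarantees the script only fires on a green build.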

Related

How to run a Jenkins job inside a Windows Docker container

I am trying to run a Jenkins job inside a Windows Docker container. I have successfully created an image based on the Windows Server Core Docker image which has MSBuildEngine 4.7.
The problem I am facing is that I am not able to run a Jenkins job inside that container.
I am able to do it easily in a Linux environment.
The actual problem is that Jenkins first writes a shell file containing the commands to run the container and inspect it.
How do I tell Jenkins that my environment is not Linux but Windows?
Note: searching on Google does not help nowadays, so I am reaching out here directly.
I am working on this issue as well. I am finding that the (or maybe just an) underlying issue is how Jenkins tells Docker to mount a volume into the container. I have yet to get around this issue.
Edit:
There is a PR addressing this issue, and I have tested the fork with both Linux and Windows slaves; it works as intended.
Download Rbutcher's fork of the plugin:
git clone https://github.com/rbutcher/docker-workflow-plugin.git
Change into the repo and switch to the working branch:
cd docker-workflow-plugin
git checkout feat/windows_slaves
Build the plugin:
mvn -DskipTests clean install
Manually import into Jenkins:
Manage Jenkins > Manage Plugins > Advanced > Upload Plugin, and select ./target/docker-workflow.hpi.

How to trigger a Jenkins job at boot

When running Jenkins as a Docker container, some advanced setup may be lost on upgrade (or restart). My typical example is downloading the wildfly-cli jar into /var/lib/jenkins/war/WEB-INF/lib/ for the wildfly-deployer.
I find it easy to implement such setup thanks to a Jenkins job.
And I now face the following question: is there a way to trigger that Jenkins job only once, after system/Jenkins boot?
I have an idea, which might be somewhat hacky: build a custom Docker image based on the original Jenkins image and add an extra step to your Dockerfile.
That extra step would trigger the job. Jenkins does have an option to start a job externally, e.g. from a script, or in your case from your Dockerfile.
You can rebuild and restart that container and it will run the build once. Would that work for you?
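A sketch of that idea, assuming the setup job has "Trigger builds remotely (e.g., from scripts)" enabled with an authentication token; the job name and token here are hypothetical, and the jenkins.sh path is the stock entrypoint script of the official jenkins/jenkins image. One caveat: a Dockerfile RUN step executes at image build time, so the trigger itself belongs in an entrypoint wrapper that runs at container start:

#!/bin/sh
# wrapper entrypoint: start Jenkins in the background, then fire the setup job once
/usr/local/bin/jenkins.sh "$@" &

# wait until Jenkins answers HTTP, then trigger the job via its remote-build token
until curl -sf http://localhost:8080/login > /dev/null; do sleep 5; done
curl -X POST "http://localhost:8080/job/post-boot-setup/build?token=MY_TOKEN"

wait

The Dockerfile in the custom image then just COPYs this wrapper in and sets it as the entrypoint.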

What are the benefits of Docker with Jenkins Pipelines?

I'm new to Jenkins/Docker. So far I've found lots of official Jenkins documents recommending use with Docker, but the necessity and advantages of running Jenkins as a Docker container remain vague to me. In my case, it's a Node/React app and the required environment is not complicated.
Disadvantages I've found of running Jenkins as a Docker container:
High hard-drive usage
Directory paths inside the Docker container are more complicated to deal with, especially when working with SSH in pipeline scripts
Without Docker, I can easily achieve the same, and the Blue Ocean plugin is also available.
So, what are the main benefits of Docker with Jenkins/Jenkins Pipelines? Are there pitfalls for my Node application using Jenkins without Docker? Articles to help me dive in are also appreciated.
Jenkins as Code
The main advantage of Jenkins in Docker is that it helps you get: Jenkins as code.
Advantages of Jenkins as code are:
SCM: the code can be put under version control
History is transparent, and backup and roll-back become easy.
The code is the documentation of your Jenkins setup.
Jenkins becomes portable, so you can run Jenkins locally to try new plugins etc.
Jenkins pipelines work really well with Docker. As #Ivthillo mentioned: there is no need to install additional tools, you just use images of those tools. Jenkins will download them from the internet for you (Docker Hub).
For each stage in the pipeline you can use a different image (i.e. tool). Essentially you get "micro Jenkins agents" which only exist temporarily. This makes your Jenkins setup much cleaner.
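For example, a minimal declarative pipeline where each stage runs in its own tool image (the image tags and commands are illustrative, not from the original answer):

pipeline {
    agent none
    stages {
        stage('Backend') {
            agent { docker { image 'maven:3.9-eclipse-temurin-17' } }   // temporary "micro agent"
            steps {
                sh 'mvn -B verify'
            }
        }
        stage('Frontend') {
            agent { docker { image 'node:20-alpine' } }   // a different tool, nothing installed on the agent
            steps {
                sh 'npm ci && npm test'
            }
        }
    }
}

Each container exists only for the duration of its stage and is removed afterwards.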
A disadvantage is:
Jenkins' initial (Groovy) configuration is poorly documented on the web.
Simple Node setup
Most of these arguments also hold for a simple Node setup.
Changing the Node version, or running multiple jobs each with a different Node version, becomes easy (see the sketch below).
Add your Jenkinsfile inside the Node repo, so everyone with a Jenkins+Docker setup can run your CI/CD.
And finally: gaining knowledge of running your app inside a container will enable you to run your production app in Docker in the future.
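For the Node case, the Jenkinsfile in the repo could be as small as this sketch (the image tag and npm scripts are assumptions about the project); changing the Node version is just a matter of swapping the image tag:

// lives in the repo root as 'Jenkinsfile'
pipeline {
    agent { docker { image 'node:18-alpine' } }   // swap the tag to build with another Node version
    stages {
        stage('Install') { steps { sh 'npm ci' } }
        stage('Test')    { steps { sh 'npm test' } }
        stage('Build')   { steps { sh 'npm run build' } }
    }
}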
Getting started
A while ago I wrote a small blog on how to get started with Jenkins and Docker, i.e. creating a Jenkins image for development which you can launch and destroy in seconds.

Jenkins pipeline using Docker on existing slaves

We have the following Jenkins setup:
Jenkins master
Jenkins Slave1
Jenkins Slave2
Jenkins Slave3
Those are all virtual machines and the slaves always exist; they don't spin up and down automatically.
Now we have builds which need a lot of tools (Maven, Python, AWS CLI, ...). We can install every tool on every slave and everything will work fine.
But we want a Docker-based approach.
Nearly all the tutorials I've seen use slaves in Docker: they use some orchestration tool like Kubernetes, create slaves in Docker, do their stuff and delete the pod again.
We don't have the possibility to do this.
Question: Is it a decent approach to use an 'old' Jenkins setup with real VM slaves on which we use Docker?
What I'm thinking about is writing a pipeline where each stage uses a Docker container:
start the build (it will choose a slave, e.g. Slave1)
the pipeline starts
stage 1: spin up e.g. a Python container: git clone and execute Python commands. Mount a volume to the workspace??
stage 2: spin up e.g. an AWS container, mount the content of the workspace and execute new commands etc.
Can someone evaluate this approach?
This is a very good approach. In fact, exactly this is documented in the Jenkins docs under the "Using multiple containers" section.
In each stage you basically spin up a container with the necessary tools available, and you can use a volume to persist output from a stage into the workspace so that the other stages can use it.
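A sketch of the asker's two stages on an existing VM slave (the slave label, image tags, commands and bucket are hypothetical; it assumes the Docker Pipeline plugin plus a Docker daemon on each slave). docker.image(...).inside mounts the slave's workspace into the container, so output written in one stage is visible in the next:

pipeline {
    agent { label 'Slave1' }   // one of the existing VM slaves
    stages {
        stage('Python') {
            steps {
                checkout scm   // clone into the slave's workspace (stage 1 of the plan)
                script {
                    // the workspace is mounted into the container automatically
                    docker.image('python:3.11-slim').inside {
                        sh 'pip install -r requirements.txt && python -m pytest'
                    }
                }
            }
        }
        stage('AWS') {
            steps {
                script {
                    // same workspace, so the previous stage's output is available;
                    // the aws-cli image uses "aws" as its entrypoint, hence the override
                    docker.image('amazon/aws-cli').inside("--entrypoint=''") {
                        sh 'aws s3 sync dist/ s3://my-bucket/app/'
                    }
                }
            }
        }
    }
}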

Can Jenkins be used for Docker Swarm deployment?

How can Jenkins be automated for Docker Swarm deployment?
I am wondering if there are any plugins available in Jenkins which help with Docker Swarm deployment, or any other way to achieve automated Swarm deployment using existing Jenkins plugins.
I fixed this problem by using a plugin called Publish Over SSH.
You need to install the Jenkins plugin “Publish Over SSH”; this plugin allows us to:
Send files over SSH (SFTP)
Execute commands on a remote server
The first step is to add the remote hosts, and the second is to add an execution/build step where the commands will be executed.
Follow this link for step-by-step instructions.
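For the Swarm part, the command configured in the plugin's remote execution step would typically be the standard stack deploy, run on a Swarm manager node (the stack and compose-file names are hypothetical):

# executed on the Swarm manager via the plugin's remote exec step
docker stack deploy --compose-file docker-compose.yml my_stack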
