We are searching for a CI/CD solution for our web app, which is based on Node.js/Meteor.
Our process should be:
On each push to master, pull request, or merge to master, do the following:
Checkout
Run code style checks (coffeelint, scss-lint, etc.)
Build the code
Run the tests
Generate a tarball archive
Deploy the archive to the development (quality management) server, extract it and run it
The next step would be manual testing of the app on our dev server.
When we think it is deployable, I want to have a button in Jenkins like "Deploy these artifacts NOW to the live instance". How can I achieve this? It would also be nice to have something like "deploy these artifacts to the live instances at 2 a.m.".
Everything from checkout to deployment to the dev server is already implemented in Jenkins. What we need now is a button to "deploy this artifact to live".
You need another job to get this working. That job takes the artifact from the build job and deploys it wherever you want.
There is no way to include such behavior in the same job.
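To make this concrete, here is a minimal sketch of what such a second, manually triggered "deploy to live" job could look like as a pipeline; the "button" is then simply this job's Build Now action. The build job name, archive path and SSH target below are assumptions, and the copyArtifacts step comes from the Copy Artifact plugin.

// Hypothetical standalone "deploy to live" pipeline.
pipeline {
    agent any
    stages {
        stage('Fetch artifact') {
            steps {
                // Copy Artifact plugin: grab the tarball produced by the build job
                copyArtifacts projectName: 'webapp-build',
                              selector: lastSuccessful(),
                              filter: 'dist/webapp.tar.gz'
            }
        }
        stage('Deploy to live') {
            steps {
                // push the archive to the live instance, unpack it and restart the app
                sh '''
                    scp dist/webapp.tar.gz deploy@live.example.com:/opt/webapp/
                    ssh deploy@live.example.com "cd /opt/webapp && tar xzf webapp.tar.gz && ./restart.sh"
                '''
            }
        }
    }
}

For the 2 a.m. case, the same logic can live in a job with a "Build periodically" trigger such as cron('0 2 * * *'), kept separate from the button-triggered job so the nightly deployment does not depend on anyone clicking anything.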
I'm exploring the Jenkins world to see if it can fit my needs for this case.
I need to build two git repositories (backend and frontend). For the backend, I would need:
Choose the branch we want to build from a list
Check out the branch and build a Docker image using the Dockerfile
Push the image to ECR
Release it to a specific Kubernetes deployment
After the backend build, we have to build the frontend by doing:
Choose the branch we want to build from a list
Check out the branch and run an npm script to build
Deploy the result to an S3 folder
The build of the project should be triggered only manually, by the project owner (who is not a developer).
Is Jenkins the right way to go? And if yes, could you point me to how you would do it?
Thanks
Yes, you can definitely implement what you need with Jenkins. There are different ways to implement each step. But here are some things you can consider using.
For branch listing, you can consider using a plugin like the List Git Branches Plugin.
For Docker image building and pushing, you can use the Jenkins Docker pipeline steps.
For the Kubernetes part, you can probably use a shell script (kubectl) or something like the Kubernetes CLI plugin.
For the S3 part, you can use the S3 Publisher Plugin.
A rough sketch of how these pieces could fit into one pipeline is below.
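Repository URLs, the ECR registry, credentials ids, the Kubernetes deployment name and the S3 bucket in this sketch are all placeholders; plain string parameters stand in for the branch list that the branch-listing plugin would normally populate, and the docker.* steps come from the Docker Pipeline plugin.

// Hypothetical manually triggered pipeline covering both builds.
pipeline {
    agent any
    parameters {
        string(name: 'BACKEND_BRANCH',  defaultValue: 'main', description: 'Backend branch to build')
        string(name: 'FRONTEND_BRANCH', defaultValue: 'main', description: 'Frontend branch to build')
    }
    stages {
        stage('Backend: build and push image') {
            steps {
                dir('backend') {
                    git url: 'https://github.com/example/backend.git', branch: params.BACKEND_BRANCH
                    script {
                        def image = docker.build("backend:${env.BUILD_NUMBER}")
                        // credentials id format assumes the Amazon ECR plugin
                        docker.withRegistry('https://123456789012.dkr.ecr.eu-west-1.amazonaws.com', 'ecr:eu-west-1:aws-creds') {
                            image.push()
                        }
                    }
                }
            }
        }
        stage('Backend: release to Kubernetes') {
            steps {
                // assumes kubectl is installed and configured on the agent
                sh 'kubectl set image deployment/backend backend=123456789012.dkr.ecr.eu-west-1.amazonaws.com/backend:${BUILD_NUMBER}'
            }
        }
        stage('Frontend: build and upload to S3') {
            steps {
                dir('frontend') {
                    git url: 'https://github.com/example/frontend.git', branch: params.FRONTEND_BRANCH
                    sh 'npm ci && npm run build'
                    // assumes the AWS CLI and credentials are available on the agent
                    sh 'aws s3 sync build/ s3://example-frontend-bucket/app/'
                }
            }
        }
    }
}

To keep the trigger restricted to the project owner, leave the job without SCM or webhook triggers and limit its Build permission to that user, for example with a matrix- or role-based authorization strategy.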
I want to know how to configure Jenkins to work with my live preprod instance server.
Let me explain my process; please tell me if I'm right, and if not, what the better way to do it is.
1) I have my project project-1 on a server: /var/www/preprod/project-1. This project is built on the Magento CMS, so it contains many files.
2) I copied this project project-1 into a Git repository, repo = project-1.
3) I cloned this project from that Git repository to my local machine: MAMP/htdocs/project-1.
4) I installed Jenkins and configured it with Git, so when I push, Jenkins runs a build automatically.
Now what I want is that, after the build, Jenkins uploads these changes to my live preprod server, both automatically and manually (I want to know how to do it manually and automatically).
With this method I develop on my local server; when I finish a task, I push it to Git to keep the change history, and after that I need to push it to the live server.
So please tell me whether I'm using the right method, whether it is good practice, and what I am missing for continuous deployment and delivery.
You can push it to the server using the Publish Over SSH plugin if you're doing a freestyle job (https://wiki.jenkins.io/display/JENKINS/Publish+Over+SSH+Plugin). If you're doing a pipeline, then you can use a simple scp command.
You can run this after the build completes, and it will happen automatically for you.
For the manual route, you will be notified when a build is done and can then copy it to your server the way you normally would, i.e. copy and paste.
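For the pipeline variant, a minimal hedged sketch (host, user and paths are placeholders, and an SSH key for the deploy user is assumed to be set up on the Jenkins agent) could be:

// Check out project-1 and copy it to the preprod document root.
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                git url: 'https://github.com/example/project-1.git', branch: 'master'
            }
        }
        stage('Deploy to preprod') {
            steps {
                // plain scp as suggested above; rsync would avoid re-copying unchanged files
                sh 'scp -r ./* deploy@preprod.example.com:/var/www/preprod/project-1/'
            }
        }
    }
}

Triggered by the Git push, this covers the automatic path; putting the same deploy stage in a job you start by hand covers the manual one.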
Jenkins is an automation server. The whole point of using Jenkins is to automate things so that manual intervention is not required, so automate wherever possible.
Hope it helps :)
I installed the Deploy Plugin on my Jenkins instance in order to automate the deployment of my Maven-built WAR packages to Tomcat 7. The problem is that I am able to use the plugin to deploy to a remote Tomcat server only if the WARs are built within the same job that uses the Deploy Plugin. In other words, I have not been able to set up a standalone job that deploys artifacts made by a different job.
For example, I have a job named pack.foo. It uses the source code in /var/lib/project/module to create module.war and put it in /var/lib/project/module/target. However, because of the Maven version setup, the artifact posted on pack.foo's artifact page is something like module-2.0.0-SNAPSHOT.war.
The only way I am able to deploy module.war is if I add a Post-build Action to pack.foo and specify **/module.war to be deployed to a remote Tomcat manager URL (provided I have the manager's credentials in the Jenkins config). Then the job's console output logs that /var/lib/project/module/target/module.war was deployed to that URL:
Deploying /var/lib/project/module/target/module.war to container Tomcat 7.x Remote with context
[/var/lib/project/module/target/module.war] is not deployed. Doing a fresh deployment.
Deploying [/var/lib/project/module/target/module.war]
How can I use this, or another plugin, to deploy a WAR artifact that was made in a separate Jenkins job? I would like to have separate jobs for artifact creation and deployment. The plugin wasn't finding **/module-2.0.0-SNAPSHOT.war or even **/module.war built by another job even though there was definitely a file on disk that matched that pattern.
See the paragraph on the Deploy Plugin's page you linked:
How to rollback or redeploy a previous build
There may be several ways to accomplish this, but here is one suggested method:
Install the Copy Artifact Plugin
Create a new job that you will trigger manually only when needed
Configure this job with a build parameter of type "Build selector for Copy Artifact", and a copy artifact build step using "Specified by build parameter" to select the build.
Add a post-build action to deploy the artifact that was copied from the other job
Now when you trigger this job you can enter the build number (or use any other available selector) to select which build to redeploy. Thanks to Helge Taubert for this idea.
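In pipeline form, a hedged sketch of such a standalone redeploy job might look like this. The upstream job name (pack.foo), Tomcat host, context path and credentials id are assumptions, and the upload goes through Tomcat's manager text API with curl instead of the Deploy Plugin's post-build action.

// The build to deploy is chosen at trigger time via the Copy Artifact
// plugin's "build selector" parameter.
pipeline {
    agent any
    parameters {
        buildSelector(name: 'DEPLOY_BUILD',
                      defaultSelector: lastSuccessful(),
                      description: 'Which pack.foo build to (re)deploy')
    }
    stages {
        stage('Copy artifact') {
            steps {
                copyArtifacts projectName: 'pack.foo',
                              selector: buildParameter('DEPLOY_BUILD'),
                              filter: '**/module*.war',
                              flatten: true
            }
        }
        stage('Deploy to Tomcat 7') {
            steps {
                withCredentials([usernamePassword(credentialsId: 'tomcat-manager',
                                                  usernameVariable: 'TC_USER',
                                                  passwordVariable: 'TC_PASS')]) {
                    // Tomcat manager text API; "path" is the web application context
                    sh '''
                        curl -f -u "$TC_USER:$TC_PASS" -T module-2.0.0-SNAPSHOT.war \
                             "http://tomcat.example.com:8080/manager/text/deploy?path=/module&update=true"
                    '''
                }
            }
        }
    }
}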
I am looking to implement CI/CD in my current project; here is what I think will work.
Environment consists of
- Jenkins
- git
- docker
- gradle
- Linux servers
- Sonar
- Ansible.
Each tool will be used as follows.
Git: developers will push their code to this VCS.
Jenkins: on detecting a check-in, Jenkins will trigger a build and deploy it to one of the servers.
Sonar: will be used for code coverage and will check the code before it is built through Jenkins.
Ansible: will be used to quickly prepare newly added nodes so that code can be deployed to them.
Docker: in case we need fresh test environments every time, we can use a Docker + Ansible combination.
The flow of work will be:
The developer runs unit tests on their machine and commits the code to the server.
Jenkins will pull the code from Git, run Sonar on it, and generate reports.
Jenkins will create a build and deploy it to the dev server.
A Jenkins job will run the integration tests against the dev server.
Any other automated tests can be run.
Finally, the build is pushed to the next server using Jenkins.
I will use shell commands inside Jenkins to push compiled code from one server to another.
In this scenario, can someone answer the following?
Where does Sonar fit in, and how should it be used?
I see there are dedicated CD tools; can't I push compiled code to the servers using shell scripts written inside the Jenkins jobs to deploy automatically? What extra benefits does a CD tool provide?
Is it wise to create a fresh test environment each time, or can we keep reusing the old one?
Will this give us complete CI/CD?
Can someone share their implementation?
You say you plan to use Git. I'll outline a scenario using Git on GitHub:
Developers push code changes here as pull requests
The SonarQube GitHub Plugin kicks off an initial analysis of only the code changed in the PR looking for the introduction of new issues (note that coverage and duplications are not included in this check)
Once the PR is merged, Jenkins (in one job or several, depending on your needs)
builds
fires integration tests & any other automated tests
runs the SonarQube scan (note that this comes last so the analysis includes the integration test results)
pushes the build to the next server
Note that the ability to break the build when the project doesn't pass the SonarQube Quality Gate you've set up may be desirable in your situation. Unfortunately, it's not available in the current server version, 5.2. It is available in 5.1, and it should return soon.
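For the question of where Sonar fits, a hedged Jenkinsfile sketch of the order above, assuming a Gradle build and the SonarQube Scanner for Jenkins (server name, Gradle tasks, artifact path and target host are placeholders), could be:

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh './gradlew clean build'
            }
        }
        stage('Integration and other automated tests') {
            steps {
                sh './gradlew integrationTest'
            }
        }
        stage('SonarQube analysis') {
            steps {
                // runs last so coverage and integration test results are part of the analysis
                withSonarQubeEnv('MySonarServer') {
                    sh './gradlew sonarqube'
                }
            }
        }
        stage('Push to next server') {
            steps {
                // the "shell commands inside Jenkins" approach from the question
                sh 'scp build/libs/app.jar deploy@next-server.example.com:/opt/app/'
            }
        }
    }
}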
I am using the Jenkins Build Pipeline plugin for build pipelines. In my pipeline I have a manual step (Build other project (Manual Step)). When I trigger a build it stops at this manual step (which is fine), and then I want to run it. But I don't want to trigger it via the Jenkins GUI; I want to trigger it via a call from some other application.
Example (Jenkins job pipeline):
Build -> Deploy Test -> (manual) Deploy Production
Now I want a second app with a big button, "Test OK, deploy to production", that calls the Jenkins manual job.
Yes, have a look at this; it explains how to use Jenkins' web service API to call a build from outside of Jenkins:
Calling a jenkins build from outside of jenkins?
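As a hedged illustration of that approach, the external application only needs to POST to the downstream job's /build URL. The Jenkins URL, job name and user/API token below are placeholders; with an API token, basic authentication is usually enough, though older Jenkins versions may additionally require a CSRF crumb.

// Minimal Groovy snippet an external app could run to start the manual
// "Deploy Production" job via Jenkins' remote API.
def jenkinsUrl = 'https://jenkins.example.com'
def jobName    = 'deploy-production'
def auth       = 'someuser:someapitoken'.bytes.encodeBase64().toString()

def conn = new URL("${jenkinsUrl}/job/${jobName}/build").openConnection()
conn.requestMethod = 'POST'
conn.setRequestProperty('Authorization', "Basic ${auth}")
println "Jenkins answered HTTP ${conn.responseCode}"   // 201 means the build was queued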