I only have experience doing builds with Jenkins, and deployment was via UrbanCode. Now, if I want to use Jenkins for release and deployment automation to replace UrbanCode, where should I start? I see workflow, pipeline, etc., but I don't know what Jenkins' exact solution for deployment automation is.
After spending a day or two, I am still struggling, this time with the sheer number of different plugins. I don't know exactly which plugins should be used, e.g. what is the relationship between this: https://wiki.jenkins-ci.org/display/JENKINS/Pipeline+Plugin and this: https://wiki.jenkins-ci.org/display/JENKINS/Build+Pipeline+Plugin? I am desperately looking for a well-organised document on how to use the Pipeline plugin to build a complete build and deployment automation solution.
I'm trying to find a Jenkins-like pipeline feature in UCD, where deployment is automated without any clicks. From the online resources I found, people in UCD usually choose the environment and versions manually. Can anyone help me with how I can do that?
UCD is completely different from Jenkins. It does not have Jenkins-like pipeline (declarative/scripted) features. UCD is a more controlled, environment-oriented, and secure deployment tool. It doesn't have the beautiful visual pipeline features that Jenkins has.
We are starting to develop a CI workflow for our systems in my company.
Currently we just run a few basic tasks like build, test, and upload to Nexus.
The tech stack is a Java project built with Gradle, and Jenkins runs our builds.
Currently I'm working with some basic Groovy scripts to do what we need, but each time I copy and paste my updated code into Jenkins and run the job from the Jenkins UI to see the results, which doesn't seem like a very good approach for developing such automation code.
My question is, what is the best practice to build and run Jenkins jobs?
Is it possible to run it straight from IntelliJ?
Do we need to create a Jenkins project which should be saved as a repository and then deployed to the Jenkins machine?
Do we need to use some IntelliJ plugins in order to work with Jenkins?
More best practices are welcome :)
Jenkins has an API - so you can do whatever you want!
But in general, for small to medium teams it's better to use a Jenkinsfile and let Jenkins pull code changes (or pull requests) from SCM and trigger builds. You can also configure hooks to trigger builds if your SCM supports this (GitHub & Bitbucket both do).
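For example, a minimal declarative Jenkinsfile kept at the root of the repository might look like the sketch below - the stage layout and the Gradle tasks are just assumptions based on your build/test/Nexus-upload steps:

```groovy
// Jenkinsfile at the repository root; Jenkins picks it up via a
// Multibranch Pipeline job or an SCM webhook/polling trigger.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh './gradlew clean assemble'
            }
        }
        stage('Test') {
            steps {
                sh './gradlew test'
            }
        }
        stage('Upload to Nexus') {
            steps {
                // Assumes your Gradle build configures a 'publish' task
                // pointing at your Nexus repository.
                sh './gradlew publish'
            }
        }
    }
}
```

This way the pipeline definition lives in the same repository as the code, so you edit it in IntelliJ and review changes to it like any other file.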
If you are eventually packaging your artifacts into a Docker image, I would highly recommend Docker multi-stage builds.
If you are completely new to CI/CD stuff - Atlassian has a lot of good resources https://www.atlassian.com/continuous-delivery/principles/continuous-integration-vs-delivery-vs-deployment
Currently, I use the Build Pipeline Plugin to orchestrate the delivery of my code through the different environments:
Build the code and execute unit tests
Manually deploy to the development environment
Automatically execute tests on the development environment
Manually release the software and set the version number to the released version.
Manually deploy to the integration test environment by downloading the artefact from a repository, based on the version set by the release build.
Manually deploy to ...
With Jenkins 2.0 comes the Pipeline plugin. But how do these two plugins relate to each other?
Should I migrate to the latest plugin? These are the things I seem to be missing in the Jenkins 2 Pipeline plugin:
Manually trigger a stage. I can wait for an input (see the sketch after this list), but that does not seem very elegant.
Restart a stage to retrigger a deployment. This does not seem possible.
Visibility into the parameters that were used to trigger a stage, e.g. the version number of the software that was deployed.
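For reference, the input step I'm referring to looks roughly like this in a scripted pipeline (the deploy script and parameter name are made up):

```groovy
stage('Deploy to integration') {
    // Pauses the build until someone responds in the Jenkins UI.
    // With a single parameter, 'input' returns the entered value directly.
    def version = input(
        message: 'Deploy to integration?',
        parameters: [string(name: 'VERSION', defaultValue: '', description: 'Version to deploy')]
    )
    sh "./deploy.sh integration ${version}"  // hypothetical deploy script
}
```

It works, but the prompt is buried in the build's console/input page rather than being a first-class button on the stage, which is why it feels less elegant.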
Am I missing the point here? Should the two of them be combined? Or how are you approaching a pipeline like this?
With the current state of Jenkins 2 pipelines you are correct to state all the 'missing features' you listed.
One of the advantages of the Jenkins 2 pipeline plugin is that rather than chaining together a series of jobs as with the Build Pipeline Plugin, your entire pipeline is 1 'job', which makes user administration much easier IMO.
The other advantage of Jenkins 2 pipelines would be 'configuration as code', so you can track changes to your pipeline as you would any other file in version control.
Jenkins 2 pipelines are very much the new 'hotness', and many plugins are implementing compatibility day after day.
Once the new UI becomes production ready, I'd imagine that the old build pipeline plugin will begin to be deprecated.
Also you should be aware that the Build Pipeline plugin is not maintained by the Jenkins or CloudBees teams as far as I know, whereas Jenkins 2 pipelines are.
Would I recommend migrating now? No, I personally still don't consider Jenkins 2 pipelines mature enough for deployment to production in an organisation. I'd say to stick with what you know for now while you wait for the Jenkins 2 Pipeline ecosystem to mature.
I gave my reasoning in a blog post a few weeks ago (read more here if you want, but I've extracted the 'weaknesses' out for you):
There are still plenty of plugins that I and many others would consider 'core to their CI pipeline' that have only partial support for pipelines, or none at all.
The lack of per-project configuration in pipelines for many plugins. E.g. Slack: the current implementation assumes that all Jenkins 2 Pipeline projects should post to the same Slack channel/team, whereas you may want to have multiple Slack teams configured. There are multiple other plugins like this.
At present the documentation of Jenkins 2 Pipelines is very limited, though this is improving.
I have an application which needs to be tested on several architectures (CentOS 5, CentOS 6, CentOS 7).
I implemented a Jenkinsfile in which I run the set of tests for a chosen target architecture.
Now I want to somehow run these tests for all target architectures. How can I achieve this?
I was told that I need to investigate Jenkins multi-configuration projects, but all the examples I could find are about Java-based projects only. And if this is the approach to use, how can I call my Jenkins script with different input parameter values?
I would appreciate it if someone could provide some hints on where to start and which Jenkins plugins to use.
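For context, the closest thing I could come up with myself is fanning out with parallel in a scripted pipeline, though I don't know if this is the right approach - the agent labels and test script below are made up:

```groovy
// Run the same test suite once per target, on agents labelled accordingly.
def targets = ['centos5', 'centos6', 'centos7']
def branches = [:]

for (t in targets) {
    def target = t  // capture the loop variable for the closure below
    branches[target] = {
        node(target) {  // assumes agents carry these labels
            checkout scm
            sh "./run-tests.sh ${target}"  // hypothetical test script taking the target as a parameter
        }
    }
}

parallel branches
```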
thanks
I have written a Jenkins pipeline which builds an application.
Now, I want to achieve a matrix-based approach, namely build
I'm prototyping a new build system using Jenkins, Gradle, and Artifactory. There seems to be conflicting or rather overlapping features in these tools, in regards to specifying the build artifacts and their destination. I see three paths going forward:
Specify the artifact settings on the particular task in Jenkins, using the Jenkins Artifactory plugin.
Specify the artifact settings in the Gradle build scripts, using the Gradle Artifactory plugin.
Specify generic maven repo settings in the Gradle build scripts, using the standard Gradle "maven" plugin.
I see pros and cons to all of these approaches, but as far as I can tell, none of them is missing a critical feature for our builds.
To further my confusion, the Gradle Artifactory plugin wiki states:
Build Server Integration - When running Gradle builds in your continuous integration build server, it is recommended to use one of the Artifactory Plugins for Jenkins, TeamCity or Bamboo to configure resolution and publishing to Artifactory with build-info capturing, via your build server UI.
So, some questions to get the conversation going:
Does it make sense to clutter the build scripts with artifact logic? It might help to add that developers don't deploy. Currently, I only see build artifacts being uploaded from the Jenkins task.
Does leaving all of this build logic in the task configuration expose us to issues, in the event that the CI server is down?
What about version control for artifact changes done through the CI interface?
I've seen simple Bamboo configurations that specify the build artifacts through the CI server UI, rather than the POMs. Is this just a bad build practice?
Is there a killer tool integration feature that separates one of these approaches from the other?
How useful is the build info object? Is that only available in the Jenkins Artifactory plugin and not the Gradle Artifactory plugin?
I am really hoping to hear from existing users of these tools and what pitfalls/requirements may have led them to one of the approaches above (or perhaps even a better one that I haven't considered yet).
Does it make sense to clutter the build scripts with artifact logic? It might help to add that developers don't deploy. Currently, I only see build artifacts being uploaded from the Jenkins task.
I'd say that's the way to go. Your build server is the single source of truth, and only artifacts built on the build server should be deployed.
Does leaving all of this build logic in the task configuration expose us to issues, in the event that the CI server is down?
That one is simple - you shouldn't deploy while your CI server is down. Building on a local machine might produce wrong artifacts, which shouldn't be deployed.
What about version control for artifact changes done through the CI interface?
Not sure I understood your question.
I've seen simple Bamboo configurations that specify the build artifacts through the CI server UI, rather than the POMs. Is this just a bad build practice?
This configuration ignores Maven's ability to deploy, and I am not sure I can find a good scenario to justify it. The only thing I can think of is a deferred deploy, but the Artifactory plugin can take care of that.
Is there a killer tool integration feature that separates one of these approaches from the other?
Now we got to the essence :)
Well, defining what you deploy in your build script (in the case of Gradle) gives you the flexibility to fine-tune every aspect of the deployment (think about the dynamic properties you might want to add in certain cases). Another very serious advantage is that your build script is source code, which means it is versionable in your version control.
The advantage of defining the deployment details in the build server configuration is that the build server is the only place the deployment should occur. So, if you don't have the deployment details in your build script, you know for sure it won't be deployed standalone.
So, how can you combine the two to get the best of both worlds?
Code your deployment logic in your Gradle script using the Artifactory plugin DSL. Provide details like the username and password from properties, which exist on the build server only.
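A minimal sketch of what that could look like in build.gradle - the URL, repository key, publication name, and property names are placeholders you'd replace with your own:

```groovy
// Requires the Gradle Artifactory plugin to be applied.
artifactory {
    contextUrl = 'https://artifactory.example.com/artifactory'  // placeholder URL
    publish {
        repository {
            repoKey = 'libs-release-local'  // placeholder repository key
            // Credentials come from Gradle properties that exist only on
            // the build server (e.g. injected by Jenkins), never in source.
            username = project.findProperty('artifactoryUser')
            password = project.findProperty('artifactoryPassword')
        }
        defaults {
            publications('mavenJava')  // whatever publication your build defines
        }
    }
}
```

A developer running this locally simply won't have the properties, so the publish step can't accidentally deploy from a workstation.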
How useful is the build info object?
Extremely useful. The information in buildInfo was harvested during the build process, and the buildInfo is the only place it exists. Having this information is the only way you will be able to reproduce this build in the future.
Is that only available in the Jenkins Artifactory plugin and not the Gradle Artifactory plugin?
The 'artifactory' and 'artifactory-publish' Gradle plugins both generate the buildInfo object, regardless of where they are running (be it your local machine or a Jenkins build server).