What are the pros and cons of using AWS CodePipeline vs Jenkins?
I can't see a whole lot of info on the interwebs (apart from https://stackshare.io/stackups/jenkins-vs-aws-codepipeline). As far as I can see they are as follows:
AWS CodePipeline Pros
Web-based
Integrated with AWS
Simple to set up (being web-based)
AWS CodePipeline Cons
Can't be used to set up code repos locally
Jenkins Pros
Standalone software
Can be used with many systems (other than AWS)
Many options for setup (e.g. plugins)
Can be used to set up code repos locally
Any other major differences that people can use to make an informed choice?
CodePipeline is a continuous "deployment" tool, while Jenkins is more of a continuous "integration" tool.
Continuous integration is a DevOps software development practice where developers regularly merge their code changes into a central repository, after which automated builds and tests are run.
With continuous deployment, code changes are automatically built, tested, and released to production. Continuous deployment expands upon continuous integration by deploying all code changes to a testing environment and/or a production environment after the build stage.
References:
https://aws.amazon.com/devops/continuous-integration/
https://aws.amazon.com/devops/continuous-delivery/
Another downside of AWS CodePipeline is its lack of integration with source control providers other than GitHub. The only other option is to create a version-enabled Amazon S3 bucket and push your code there, which adds an extra layer between source control and CodePipeline.
Also, there is no proper documentation explaining how to push code to an Amazon S3 bucket for codebases built on commonly used platforms such as .NET. The example on the AWS website deals with some arbitrary files, which is not helpful at all.
Another (trivial?) entry missing from the cons section for AWS CodePipeline in your question is price: Jenkins is free.
A GitLab SCM solution is now covered by AWS: https://aws.amazon.com/blogs/devops/integrating-git-with-aws-codepipeline/
CodePipeline and Jenkins can accomplish the same thing. Also, you don't necessarily have to use the web UI for CodePipeline; it can be set up through an AWS SAM CLI template, very similar to CloudFormation templates.
CodePipeline also supports several source code providers: AWS CodeCommit, Amazon S3, GitHub, and Bitbucket.
I personally like CodePipeline a lot better than Jenkins if you're working in AWS. The interface is 10x cleaner, IMO. And with the SAM CLI templates your pipelines can be managed as code, similar to how you'd use a Jenkinsfile.
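As a rough illustration of pipelines-as-code, a minimal CodePipeline definition in a CloudFormation/SAM-style template might look like this (the role ARN, bucket, repo, and project names are all placeholders):

```yaml
Resources:
  AppPipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      RoleArn: arn:aws:iam::123456789012:role/PipelineRole   # placeholder role
      ArtifactStore:
        Type: S3
        Location: my-artifact-bucket                         # placeholder bucket
      Stages:
        - Name: Source
          Actions:
            - Name: Checkout
              ActionTypeId:
                Category: Source
                Owner: AWS
                Provider: CodeCommit
                Version: "1"
              Configuration:
                RepositoryName: my-repo                      # placeholder repo
                BranchName: master
              OutputArtifacts:
                - Name: SourceOutput
        - Name: Build
          Actions:
            - Name: Build
              ActionTypeId:
                Category: Build
                Owner: AWS
                Provider: CodeBuild
                Version: "1"
              Configuration:
                ProjectName: my-build-project                # placeholder CodeBuild project
              InputArtifacts:
                - Name: SourceOutput
```

Versioning a template like this alongside the application gives you roughly the same pipeline-as-code workflow as a Jenkinsfile.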
You can do a lot more with Jenkins because you can customize it with myriad plugins. Thus, you can stay on the bleeding edge, if needed.
By contrast, with CodePipeline you are limited to what AWS offers you. Of course, CodePipeline gives you the option to select Jenkins as the tool for the build step. However, that means you cannot use Jenkins for other purposes at other stages of the pipeline.
If you are a fan of HashiCorp Vault, you can easily integrate it with Jenkins to provide dynamic secrets to your builds. You cannot do that with CodePipeline; you will have to rely on the cloud-native mechanisms, in this case AWS KMS.
Here is a tutorial that shows you how to integrate Jenkins with CodePipeline - you will need several plugins to get Jenkins to talk to the different CodePipeline components.
https://aws.amazon.com/blogs/devops/setting-up-a-ci-cd-pipeline-by-integrating-jenkins-with-aws-codebuild-and-aws-codedeploy/
Related
I work for a small startup. We have 3 environments (Production, Development, and Staging) and GitHub is used as VCS.
All environments run on EC2 with Docker.
Can someone suggest a simple CI/CD solution that can trigger builds automatically after certain branches are merged, with a manual trigger option?
For example, if anything is merged into dev-merge, build and deploy to development; the same for staging, pushing the image to ECR and rolling out a Docker update.
We tried Jenkins but felt it was over-complicated for our small-scale infra.
We also evaluated GitHub Actions (self-hosted runners), but it needs YAML files to live in the repos.
We are looking for something that gives us the option to modify the pipeline or overall flow without repo-hosted CI/CD config. (Like the way Jenkins gives the option to either use a Jenkinsfile or configure the job manually via the GUI.)
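For comparison, the repo-hosted YAML that GitHub Actions would need for the dev-merge flow described above might look roughly like this (the ECR repository and image names are placeholders):

```yaml
name: deploy-dev
on:
  push:
    branches: [dev-merge]   # fires after anything is merged (pushed) into dev-merge
  workflow_dispatch: {}     # manual trigger from the Actions tab
env:
  IMAGE: 123456789012.dkr.ecr.us-east-1.amazonaws.com/myapp   # placeholder ECR repo
jobs:
  build-and-deploy:
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v4
      - run: docker build -t "$IMAGE:${{ github.sha }}" .
      - run: docker push "$IMAGE:${{ github.sha }}"
      # roll out the updated image on the development hosts here
```

A near-identical workflow with a staging-merge trigger would cover the staging environment.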
Any opinions about Team City?
I'm new to Bamboo and currently learning and using Bamboo as a standalone server at my company. There I can see advanced options like creating build plans and separate deployment projects for different environments, and I can also integrate notifications and triggers.
I want to do a lot of research and learning by myself at home, so I was looking for a cloud-based version of Bamboo that I could use straight away to perform similar tasks, like creating build plans. I don't see any cloud version of Bamboo, but I do see Bitbucket (cloud-based). What I know is that it is a source code repository like GitHub and GitLab, and that it has built-in CI/CD integration.
Q1. Is Bitbucket a cloud version of a source code repository plus Bamboo?
Q2. If not, is there a cloud version of Bamboo with exactly the same options, like build plans, deployment projects, etc.?
Q3. Also, is there any bot, like a Slack bot or DeployBot, that I can use to invoke or trigger a Bamboo build plan with a chat command? I'm familiar with Slack but not DeployBot. I can get Bamboo build notifications into my Slack channel, but not the other way around.
I'm learning and doing research and development, so I'd like clarification on my doubts from experts in the DevOps field to show me the right path.
Please suggest, as I'm looking to set up Bamboo with a bot driving my build plans.
Thank you.
I'm getting hands-on experience with Bamboo at my company, learning as much as I can and playing around with it.
Bamboo Cloud was discontinued in January 2017. Bitbucket Cloud can still notify your Bamboo instance via webhook, assuming you configure Bamboo and your firewall and the webhook properly, or you can use Bitbucket Pipelines for the all-in-one approach.
You can also use Bitbucket Server if you'd prefer to keep everything behind the firewall.
We are developing a CI/CD pipeline leveraging Docker/Kubernetes in AWS. This topic is touched on in Kubernetes CI/CD pipeline.
We want to create (and destroy) a new environment for each SCM branch, from the moment a Git pull request is opened until it is merged.
We will have a Kubernetes cluster available for that.
During prototyping, the dev team came across Kubernetes namespaces. They look quite suitable: for each branch, we create a namespace ns-<issue-id>.
But that idea was dismissed by our DevOps prototyper without much explanation, just stating that "we are not doing that because it's complicated due to RBAC". And it's quite hard to get detailed reasons.
However, for CI/CD purposes we need no RBAC; everything can run with unlimited privileges and no quotas. We just need a separate network for each environment.
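For the network separation part, one per-namespace policy is enough, assuming the cluster's network plugin enforces NetworkPolicy (the namespace name here is illustrative):

```yaml
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: isolate-branch-env
  namespace: ns-1234        # one policy per branch namespace
spec:
  podSelector: {}           # applies to every pod in the namespace
  ingress:
    - from:
        - podSelector: {}   # allow traffic only from pods in this same namespace
```

Applied at namespace creation time, a policy like this network-isolates each branch environment from the others without any RBAC involvement.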
Is using namespaces for such purposes a good idea? I am still not sure after reading the Kubernetes docs on namespaces.
If not, is there a better way? Ideally, we would like to avoid using Helm, as it adds a level of complexity we probably don't need.
We're working on an open source project called Jenkins X, a proposed subproject of the Jenkins Foundation, aimed at automating CI/CD on Kubernetes using Jenkins and GitOps for promotion.
When you submit a pull request, we automatically create a Preview Environment, which is exactly what you describe: a temporary environment used to deploy the pull request for validation and testing before it is approved.
We now use Preview Environments all the time for many reasons and are big fans of them! Each Preview Environment is in a separate namespace so you get all the usual RBAC features from Kubernetes with them.
If you're interested, here's a demo of how to automate CI/CD with multiple environments on Kubernetes, using GitOps for promotion between environments and Preview Environments on pull requests - using Spring Boot and Node.js apps (but we support many languages and frameworks).
I'm using Jenkins for Continuous Integration tool with DevOps tools like JIRA, Confluence, Crowd, SonarQube, Hygieia, etc.
But our environment has changed to deploying microservices to a PaaS.
So I have the following issues to resolve:
Deployment Monitoring
to view which application is deployed to which instance, and with which version.
Canary Deployment
deploy to one instance first, then to all instances (after manual approval, or automatically).
Deploy to Cloud Foundry
more specifically IBM Bluemix
So I examined Spinnaker but I found that the cloud driver for CF is no longer maintained.
https://github.com/spinnaker/clouddriver/pull/1749
Do you know another open-sourced CD tool?
Take a look at Concourse: https://concourse-ci.org/
It's open source, and you can use it to deploy either applications or Cloud Foundry itself. It's a central tool for DevOps. Basically, you have pipelines that can trigger tasks (manually or automatically). There are some ready-made resources (a GitHub connector, etc.), but you can also create your own tasks. It runs Docker containers as workers to execute tasks/jobs.
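As a sketch, a minimal Concourse pipeline with a git resource triggering a task might look like this (the repository URI and names are placeholders):

```yaml
resources:
  - name: app-repo
    type: git                     # built-in git resource
    source:
      uri: https://github.com/example/app.git   # placeholder repo
      branch: master

jobs:
  - name: build-and-deploy
    plan:
      - get: app-repo
        trigger: true             # run automatically on new commits
      - task: build
        config:
          platform: linux
          image_resource:
            type: registry-image
            source: {repository: busybox}       # placeholder task image
          inputs:
            - name: app-repo
          run:
            path: sh
            args: ["-c", "echo build app-repo here"]
```

You would install it with `fly set-pipeline` and add a Cloud Foundry resource for the actual push.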
I find it relatively easy to integrate a CD server with any PaaS provider. You will have to either use a plugin or create your own integration.
My top two recommendations would be GitLab or Bamboo, in that order.
Given your preference for Jira, you might prefer Bamboo, as it has very good integration with the rest of the Atlassian tools, but it is not open source.
I am new to DevOps and need to develop a strategy for a growing business that will handle many different services/nodes (around 100).
I've been learning about Docker, and it seems like Docker Cloud is a good service, but I just don't really know the standard use cases of the various services, and how to compare them.
I need some guidance as to how to manage the development environment, deployment, production environment, and server administration. Are Docker Cloud, Chef Cloud, and AWS ECS tools that can help with all of these, or only some aspects? How do these services differ?
If you are only starting out with DevOps, I would start with the most basic pipeline and its foundational elements.
The reason I would start with a basic pipeline is that if you have no experience, you have to get it from somewhere, and you need to understand the basics of Docker Engine and its foundational elements. In addition, you need to design the pipeline.
Here is one basic uni-container pipeline with which you can start getting some experience:
Maven - use the standard, well-understood versioning scheme in your Dockerfile(s) so your Docker tags will be e.g. 0.0.1-SNAPSHOT or 0.0.1 for a release
Maven - get familiar with and use the Spotify docker-maven-plugin
Jenkins - this will do your pulls / pushes to Nexus 3
Nexus 3 - this will proxy both Docker Hub and Maven Central and be your private registry
Deploy server (test/dev) - Jenkins will scp docker-compose files onto this environment and bring your environments up and down
Cleanup - clean up all your environments with Spotify's docker-gc (ideally daily; get Jenkins to do this)
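To make the deploy step concrete, the docker-compose file Jenkins copies to the deploy server might look roughly like this (the Nexus host, image name, and port are placeholders):

```yaml
version: "3"
services:
  app:
    image: nexus.example.com:8082/myorg/myapp:0.0.1-SNAPSHOT   # pulled from the Nexus 3 private registry
    ports:
      - "8080:8080"
    restart: unless-stopped
```

Jenkins then runs `docker-compose up -d` over ssh to bring the environment up, and `docker-compose down` to tear it back down.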
Once you have the above going, then move onto cloud services, orchestration etc - but first get the basics right.