Continuous Delivery tool for Cloud Foundry - Jenkins

I'm using Jenkins as our Continuous Integration tool, alongside DevOps tools like JIRA, Confluence, Crowd, SonarQube, Hygieia, etc.
But our environment is changing: we now deploy microservices to a PaaS.
So I have the following issues to resolve:
Deployment Monitoring
to view which application is deployed to which instance, with which version.
Canary Deployment
deploy to one instance first, then to all instances (after manual approval, or automatically).
Deploy to Cloud Foundry
more specifically IBM Bluemix
So I examined Spinnaker, but I found that its cloud driver for Cloud Foundry is no longer maintained:
https://github.com/spinnaker/clouddriver/pull/1749
Do you know of another open-source CD tool?

Take a look at Concourse: https://concourse-ci.org/
It's open source, and you can use it to deploy either applications or Cloud Foundry itself. It's a central tool for DevOps. Basically, you have pipelines that can trigger tasks (manually or automatically). There are ready-made resources (a GitHub connector, etc.), but you can also create your own tasks. It runs Docker containers as workers to execute tasks/jobs.
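As a rough sketch of what that looks like in practice, here is a minimal Concourse pipeline that pushes an app to Cloud Foundry on every commit, using the git resource and the cf resource. All names, the repository URI, and the Bluemix API endpoint below are placeholders, not values from the question:

```yaml
# Hypothetical sketch: trigger a Cloud Foundry deployment on every push.
# Repo URI, org/space names, and credentials are placeholders.
resources:
- name: app-source
  type: git
  source:
    uri: https://github.com/your-org/your-app.git
    branch: master
- name: cf-dev
  type: cf                             # Concourse's Cloud Foundry resource
  source:
    api: https://api.ng.bluemix.net    # placeholder Bluemix CF API endpoint
    username: ((cf-username))          # pulled from a credentials manager
    password: ((cf-password))
    organization: my-org
    space: dev

jobs:
- name: deploy-to-dev
  plan:
  - get: app-source
    trigger: true                      # run automatically on new commits
  - put: cf-dev
    params:
      manifest: app-source/manifest.yml
```

A second job that deploys to the full space only after a manually triggered job runs would give you the canary-then-promote flow from the question.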
Best,

I find it relatively easy to integrate a CD server with any PaaS provider. You will have to either use a plugin or create your own integration.
My top two recommendations would be GitLab or Bamboo, in that order.
Given your preference for Jira, you might prefer Bamboo, as it integrates very well with the rest of the Atlassian tools, but it is not open source.


Is BitBucket cloud version of source code repo along with Bamboo for CI/CD?

I'm new to Bamboo and am currently learning and using it as a standalone server at my company. There I can see many advanced options, like creating build plans and separate deployment projects for different environments, and I can integrate it with notifications and triggers.
I want to do a lot of research and learning by myself at home, so I was looking for a cloud-based version of Bamboo that I could use straight away to perform similar tasks (creating build plans, etc.). I don't see any cloud version of Bamboo, but I do see Bitbucket (cloud-based). What I know is that it is a source code repository like GitHub and GitLab, and it has built-in CI/CD integration.
Q1. Is Bitbucket a cloud version of a source code repository plus Bamboo?
Q2. If not, is there a cloud version of Bamboo with the same options (build plans, deployment projects, etc.)?
Q3. Also, is there a bot, like a Slack bot or DeployBot, that I can use to trigger a Bamboo build plan with a chat command? I'm familiar with Slack but not DeployBot. I can get Bamboo build notifications in my Slack channel, but not the other way around.
I'm learning and doing research and development, so I'd appreciate clarification from experts in the DevOps field to show me the right path.
Please advise, as I'm looking to set up Bamboo with a bot that triggers my build plans.
Thank you.
I'm getting hands-on experience with Bamboo at my company, learning as much as I can and playing around with it.
Bamboo Cloud was discontinued in January 2017. Bitbucket Cloud can still notify your Bamboo instance via webhook (assuming you configure Bamboo, your firewall, and the webhook properly), or you can use Bitbucket Pipelines for an all-in-one approach.
You can also use Bitbucket Server if you'd prefer to keep everything behind the firewall.
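For a sense of the all-in-one approach, a minimal bitbucket-pipelines.yml looks roughly like the sketch below; the build image and script are placeholders you would swap for your own stack:

```yaml
# Hypothetical bitbucket-pipelines.yml sketch. The image and the build
# command are placeholders; pick ones matching your project.
image: maven:3-jdk-8          # Docker image the steps run in

pipelines:
  default:                    # runs on every push to any branch
    - step:
        name: Build and test
        caches:
          - maven              # cache dependencies between runs
        script:
          - mvn -B clean verify
```

Each step runs in its own container, which is the closest cloud-hosted analogue to a Bamboo build plan's stages and jobs.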

Docker/Kubernetes with on premise servers

I have a .NET Core web API and an Angular 7 app that I need to deploy to multiple client servers, potentially running a plethora of different OS setups.
Dockerising the whole app seems like the best way to handle this, so I can ensure it all works wherever it goes.
My question is about my understanding of Kubernetes and the distribution of the application. We use Azure DevOps for build pipelines, so if I understand correctly, would it work as follows:
1) Azure Dev Ops builds and deploys the image as a Docker container.
2) Kubernetes could realise there is a new version of the docker image and push this around all of the different client servers?
3) Client specific app settings could be handled by Kubernetes secrets.
Is that a reasonable setup? Have I missed anything? And are there any recommendations on setup, or guides I can follow to get started?
Thanks in advance, James
Azure DevOps will perform the CI part of your pipeline. Once that completes, Azure DevOps will push the images to ACR. The CD part should be done either directly from Azure DevOps (you may have to install a private agent on your on-prem servers, configure the firewall, etc.) or with Kubernetes-native CD tools such as Spinnaker or Jenkins X. Secrets should be kept in Kubernetes Secrets.
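To illustrate the secrets part of point 3, here is a minimal sketch of a Deployment that pulls the API image from ACR and injects one client-specific setting from a Kubernetes Secret. All names (the registry, the secret, the env key) are placeholders, not anything from your setup:

```yaml
# Hypothetical sketch: Deployment pulling from ACR, with a per-client
# setting injected from a Secret. Registry, secret, and key names are
# placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: api
spec:
  replicas: 2
  selector:
    matchLabels:
      app: api
  template:
    metadata:
      labels:
        app: api
    spec:
      containers:
      - name: api
        image: myregistry.azurecr.io/api:1.0.0
        env:
        - name: ConnectionStrings__Default   # ASP.NET Core maps "__" to ":" in config keys
          valueFrom:
            secretKeyRef:
              name: client-a-settings        # a Secret created per client
              key: connection-string
```

Rolling out a new image version is then a matter of updating the image tag in this manifest (or via `kubectl set image`) on each client's cluster.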

How do I integrate IBM Integration Bus v9.0.0.8 with Bamboo and BitBucket for CI

Our Enterprise Service Broker team is currently considering moving to the Atlassian stack, as this is a company-wide standard and will assist with our Continuous Integration (CI) and Continuous Delivery (CD).
We would like to automate our builds as well as our deployments, and use Bamboo (with a Bamboo agent) to create our artifacts and execute the scripts we have chosen to write in Ant.
We are currently using Rational Team Concert (RTC, a version control tool) and would like to move to Bitbucket so that we can use Bamboo. Can someone guide us through this process? What steps do we need to take?
I have searched the IBM documentation, and it only supports Bamboo on version 10.2.1410 of the IBM development toolkit, which we are not yet using, as we cannot upgrade yet.
Ref: https://docops.ca.com/ca-release-automation/integrations/en/optional-action-packs/ibm-integration-bus-advanced
Are there any best-practices for doing so? Tutorials maybe?
You can connect RTC to a Git repository. The documentation currently states that it supports GitLab and GitHub Enterprise, but says nothing about Bitbucket. Take a look at this part of the documentation. I think, however, that you could connect to Bitbucket and just treat it like a GitLab repo:
https://www.ibm.com/support/knowledgecenter/SSYMRC_6.0.4/com.ibm.team.connector.cq.doc/topics/c_integ_git.html

What are the pros and cons of using AWS CodePipeline vs Jenkins

What are the pros and cons of using AWS CodePipeline vs Jenkins?
I can't find a whole lot of info on the web (apart from https://stackshare.io/stackups/jenkins-vs-aws-codepipeline). As far as I can see, they are as follows:
AWS CodePipeline Pros
Web-based
integrated with AWS
simple to set up (as it's web-based)
AWS CodePipeline Cons
can't be used to set up code repos locally
Jenkins Pros
standalone software
can be used for many systems (other than AWS)
many options for setup (e.g. plugins)
can be used to set up code repos locally
Any other major differences that people can use to make an informed choice?
CodePipeline is a continuous "deployment" tool, while Jenkins is more of a continuous "integration" tool.
Continuous integration is a DevOps software development practice where developers regularly merge their code changes into a central repository, after which automated builds and tests are run.
With continuous deployment, code changes are automatically built, tested, and released to production. Continuous deployment expands upon continuous integration by deploying all code changes to a testing environment and/or a production environment after the build stage.
References:
https://aws.amazon.com/devops/continuous-integration/
https://aws.amazon.com/devops/continuous-delivery/
Another downside of AWS CodePipeline is the lack of integration with source control providers other than GitHub. The only other option is to create a version-enabled Amazon S3 bucket and push your code there. This adds an extra layer between source control and CodePipeline.
Also, there is no proper documentation explaining how to push code to an Amazon S3 bucket for codebases built on commonly used platforms such as .NET. The example on the AWS website deals with some arbitrary files, which is not helpful whatsoever.
The other (trivial?) entry missing from the cons section of your question is price: Jenkins is free.
AWS now documents an approach for integrating Git-based SCM solutions such as GitLab with CodePipeline: https://aws.amazon.com/blogs/devops/integrating-git-with-aws-codepipeline/
CodePipeline and Jenkins can accomplish the same thing. Also, you don't necessarily have to use the web UI for CodePipeline; it can be set up through an AWS SAM CLI template, very similar to CloudFormation templates.
CodePipeline also supports a lot of source code providers: AWS CodeCommit, AWS S3, GitHub, and Bitbucket.
I personally like CodePipeline a lot better than Jenkins if you're working in AWS. The interface is far cleaner, in my opinion. And with SAM CLI templates, your pipelines can be managed as code, similar to how you'd use a Jenkinsfile.
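As a sketch of what pipeline-as-code looks like here, a CloudFormation/SAM template can declare the pipeline itself. Everything below (role ARN, bucket, repo, and project names) is a placeholder, not a working configuration:

```yaml
# Hypothetical sketch of a two-stage pipeline defined as code in a
# CloudFormation/SAM template. All ARNs and names are placeholders.
Resources:
  Pipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      RoleArn: arn:aws:iam::123456789012:role/pipeline-role
      ArtifactStore:
        Type: S3
        Location: my-artifact-bucket        # holds artifacts between stages
      Stages:
        - Name: Source
          Actions:
            - Name: Checkout
              ActionTypeId:
                Category: Source
                Owner: AWS
                Provider: CodeCommit
                Version: "1"
              Configuration:
                RepositoryName: my-repo
                BranchName: master
              OutputArtifacts:
                - Name: SourceOutput
        - Name: Build
          Actions:
            - Name: Build
              ActionTypeId:
                Category: Build
                Owner: AWS
                Provider: CodeBuild
                Version: "1"
              Configuration:
                ProjectName: my-build-project
              InputArtifacts:
                - Name: SourceOutput
```

Versioning this template in the repo gives you roughly what a Jenkinsfile gives you in Jenkins: the pipeline definition lives alongside the code it builds.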
You can do a lot more with Jenkins because you can customize it with a myriad of plugins. Thus, you can stay on the bleeding edge if needed.
By contrast, with CodePipeline you are limited to what AWS offers you. Of course, CodePipeline gives you the option of selecting Jenkins as the tool for the build step. However, that means you cannot use Jenkins for other purposes at other stages of the pipeline.
If you are a fan of HashiCorp Vault, you can easily integrate it with Jenkins to provide dynamic secrets to your builds. You cannot do that with CodePipeline; you will have to rely on the cloud-native mechanisms, in this case AWS KMS.
Here is a tutorial that shows how to integrate Jenkins with CodePipeline; you will need several plugins to get Jenkins talking to the different CodePipeline components:
https://aws.amazon.com/blogs/devops/setting-up-a-ci-cd-pipeline-by-integrating-jenkins-with-aws-codebuild-and-aws-codedeploy/

How to manage deployment?

I am new to DevOps and need to develop a strategy for a growing business that will handle many different services/nodes (around 100).
I've been learning about Docker, and it seems like Docker Cloud is a good service, but I just don't really know the standard use cases of the various services, and how to compare them.
I need some guidance as to how to manage the development environment, deployment, production environment, and server administration. Are Docker Cloud, Chef Cloud, and AWS ECS tools that can help with all of these, or only some aspects? How do these services differ?
If you are only starting out with DevOps, I would begin with the most basic pipeline and its foundational elements.
The reason is that if you have no experience, you have to get it from somewhere, and you need to understand the basics of Docker Engine and its foundational elements. In addition, you need to design the pipeline.
Here is one basic uni-container pipeline with which you can start getting some experience:
Maven - use the standard, well-understood versioning scheme in your Dockerfile(s) so your Docker tags will be e.g. 0.0.1-SNAPSHOT or 0.0.1 for a release
Maven - get familiar with and use the spotify plugin
Jenkins - this will do your pulls / pushes to Nexus 3
Nexus 3 - this will proxy both Docker Hub and Maven Central and be your private registry
Deploy Server (test/dev) - Jenkins will scp docker-compose files onto this environment and tear your environments up & down
Cleanup - clean up all your environments with spotify-gc (ideally daily, get Jenkins to do this)
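To make the Maven steps above concrete, here is a hedged POM fragment using Spotify's docker-maven-plugin to tag the image with the Maven project version and point it at a private Nexus 3 Docker registry. The registry host/port and plugin version are placeholders:

```xml
<!-- Hypothetical pom.xml fragment: build a Docker image tagged with the
     Maven version (e.g. 0.0.1-SNAPSHOT or 0.0.1) for a private Nexus 3
     registry. Registry host/port and plugin version are placeholders. -->
<plugin>
  <groupId>com.spotify</groupId>
  <artifactId>docker-maven-plugin</artifactId>
  <version>1.2.2</version>
  <configuration>
    <imageName>nexus.example.com:8083/${project.artifactId}</imageName>
    <imageTags>
      <imageTag>${project.version}</imageTag>
    </imageTags>
    <dockerDirectory>${project.basedir}</dockerDirectory>
  </configuration>
</plugin>
```

With this in place, `mvn package docker:build docker:push` produces and publishes an image whose tag always matches the Maven version, which is what keeps the Jenkins and Nexus steps in the list consistent.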
Once you have the above going, then move onto cloud services, orchestration etc - but first get the basics right.
