How to deploy a WAR file to AWS EC2 instances placed in auto scaling groups - Jenkins

How can I copy a .war file for Tomcat 7 and 8 to EC2 instances placed in an auto scaling group without any downtime? I want to replace the existing .war file with the latest one on every server in the target group/scaling group. How do I configure my Jenkins for this?
Thank you

Once they are in an auto scaling group, you should not modify files directly on the servers. Changing something on one server will not be mirrored on the other servers, and even if you changed all of the currently running servers manually, those changes would not be reflected on any new servers launched by auto scaling actions.
This can be solved in several ways, for example by using AWS CodeDeploy.
You could also configure your auto scaling launch configuration with an EC2 user-data script that runs on each server when it is created. That script could check out the latest code from Git, or pull the latest build artifact from S3, and then launch the app.
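A minimal user-data sketch of that idea (the bucket name, key, webapps path and Tomcat service name are placeholders, and the instance profile is assumed to allow reading the bucket):

#!/bin/bash
# Hypothetical user-data: pull the newest build artifact from S3 and (re)start Tomcat.
aws s3 cp s3://my-build-bucket/myapp-latest.war /var/lib/tomcat/webapps/ROOT.war
systemctl restart tomcat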
When you have an upgrade ready to deploy, you then simply mark the current instances as unhealthy and wait for the Auto Scaling group to replace them with new, up-to-date instances automatically.
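Marking instances unhealthy can itself be scripted, for example from a Jenkins job; a hedged one-liner per instance (the instance ID is a placeholder):

# Ask the Auto Scaling group to replace this instance with a freshly provisioned one.
aws autoscaling set-instance-health --instance-id i-0123456789abcdef0 --health-status Unhealthy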

Related

How to deploy a WebLogic application as a Docker container completely using a Dockerfile?

I have a simple REST API in a WebLogic application, and I have to deploy the application as a Docker container. But I'm facing a problem in defining the Dockerfile.
Dockerfile
FROM store/oracle/weblogic:12.2.1.4
COPY target/app.war /u01/oracle
Above is my current Dockerfile. With it, I still have to deploy the application on the WebLogic server manually. We would like to automate the application deployment through the Dockerfile, but didn't find exact examples.
Please advise.
This is a complex task, so it is hard to explain the whole process here.
The high-level steps that you need to execute are the following:
Start a properly configured WebLogic domain in Docker. This task involves the creation of the admin and managed servers, the WL cluster, etc.
Build the application that you want to deploy
Configure the database properly if you have any
Create the WL resources like connection pools, JMS, etc. manually or via a WLST script
Deploy your artifact via the WL web console, with a WLST script, or by copying the file under the autodeploy directory (a command-line sketch follows this list)
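As an illustration of the deployment step only, here is a hedged sketch using WebLogic's weblogic.Deployer command-line tool, run once the admin server is up; the environment script path, admin URL, credentials and cluster name are assumptions based on a typical install, not taken from the question:

# Set up the WebLogic classpath, then deploy the WAR to a cluster (placeholders throughout).
. /u01/oracle/wlserver/server/bin/setWLSEnv.sh
java weblogic.Deployer \
  -adminurl t3://localhost:7001 \
  -username weblogic -password welcome1 \
  -deploy -name app -targets myCluster \
  -source /u01/oracle/app.war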
Be careful, because any tasks that you executed manually will be lost if you drop your Docker container.
You can find concrete examples, use cases, automated scripts that you can use, and well-prepared, ready-to-use WebLogic Docker images here: https://github.com/zappee/docker-images
If you have a concrete question, not a general one like this, then please start a new thread.
Take a look at the GitHub project:
https://github.com/oracle/docker-images/tree/master/OracleWebLogic/dockerfiles

Why don't my migrated Jenkins jobs display after migration from Windows to Linux?

I recently migrated our Windows instance of Jenkins to Linux. This was very easy, as I just copied everything from %JENKINS_HOME% to the Linux box.
However, someone let me know they had their own personal instance of Jenkins that they wanted to roll over to the Linux VM. I copied the jobs folder from their Windows box to the Linux VM, but the jobs don't show up in any of my views (including the All view). I also installed the Job Import plugin and it ran successfully, but I have the same issue.
I'm not sure if I have to modify my config.xml to include these views, or if there is another file that I need to merge onto my Linux VM. I have also restarted the service and reloaded the configuration from disk.
This is on Jenkins 2.0.
Does anyone have any thoughts on what I may have missed?
Based on the comment of TheEllis:
The copied job files must be owned by the same user that the Jenkins instance runs under. Furthermore, read/write/execute permissions for that owner should be set, too.
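A hedged example of fixing that after copying the jobs over, assuming Jenkins runs as the jenkins user and JENKINS_HOME is /var/lib/jenkins (adjust both to your installation):

# Hypothetical ownership/permission fix for jobs copied from the Windows box.
sudo chown -R jenkins:jenkins /var/lib/jenkins/jobs
sudo chmod -R u+rwX /var/lib/jenkins/jobs
# Then reload the configuration from disk or restart Jenkins.
sudo systemctl restart jenkins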

Deploy apps from release server

I don't like it when it comes time to release my projects to the production server. Maybe I just don't have enough experience; nobody taught me how to do this the right way.
For now I have several repos with Scala (on top of Spray). I have everything needed to build and run these projects on my local machine (of course, since I develop them). So I installed Jenkins on my production server in order to sync from Git, build, and run. It works for now, but I don't like it, because I need to install Jenkins on every machine where I want to run my projects. What if I want to show my project to a friend in a cafe?
So I've come up with an idea: what if I run tests before building the app, make a portable build (e.g. with sbt-native-packager) and save it on a remote "release server"? That server just keeps these ready-to-launch apps.
Then I go to the production server and run a bash script that downloads the executables from the release server and runs my project on that machine.
In the future I want to:
download and run projects inside Docker containers;
keep ready-to-serve static files for the frontend, and run a Docker container with nginx and a linked volume holding those static files.
I have heard about Nexus (http://www.sonatype.org/nexus/), which people use to store their artifacts (builds, images, and so on). I believe there should be open source projects built around an idea like mine.
Any help is appreciated!
A common anti-pattern, in my opinion, is to build the software every time you perform a deployment. You are best advised to separate the build process from the act of deployment by introducing a binary repository manager (you've mentioned one such example, Nexus).
Best Practice - Using a Repository Manager
Binary repository manager
How can I automatically deploy a war from Nexus to Tomcat?
Only successfully tested builds get pushed to the repository, so you can treat each successful build as a mini-release. A by-product of this is that your production server does not have to have all the build software pre-installed (Jenkins, Ant, Maven, etc.).
It should be noted that modern repository managers like Nexus and Artifactory now support Docker registries too, so you can use them for deploying Docker images as well.
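As a rough sketch of that flow (not a prescribed setup; the repository URL, credentials, artifact name and version are placeholders), the build server pushes each successfully tested package to the repository manager, and any target machine pulls it back down:

# On the Jenkins box, after tests pass: upload the packaged build to a hosted repository.
curl -u deployer:secret -T target/universal/myapp-1.2.3.zip \
  "https://nexus.example.com/repository/releases/myapp/myapp-1.2.3.zip"

# On the production (or demo) machine: fetch, unpack and run a specific release.
curl -u deployer:secret -O "https://nexus.example.com/repository/releases/myapp/myapp-1.2.3.zip"
unzip myapp-1.2.3.zip
./myapp-1.2.3/bin/myapp

An sbt-native-packager universal zip fits this model well, and Nexus/Artifactory can also expose Maven/Ivy layouts that sbt can publish to directly.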
Update
A related Chef question, for a technology where there is no intermediate binary file (like a jar). In this case the software is still "released" by creating a tar distribution stored in the repository.
chef cookbook delivery - chef server vs. artifactory + berkshelf

How to deploy a Java application (WAR) to EC2 using Jenkins?

I am setting up a development environment for a Java project.
My team decided to use Jenkins for CI and AWS EC2 (Linux) for the server.
I succeeded in building a WAR file with a Jenkins job.
But I can't find a way to copy the WAR file to EC2 and restart the Tomcat server on EC2.
I googled "jenkins ec2 deploy", but without success.
Can somebody help me?
Step 1. Install Jenkins plugin
Open your favorite browser and navigate to Jenkins. Log in and select “Manage Jenkins” followed by “Manage Plugins”. Select the “Available” tab, locate the “Deploy to container” plugin and install it.
Step 2. Edit tomcat-users.xml
In order for Tomcat to accept remote deployments, you have to add a user with the role manager-script. To do so, edit the file ../conf/tomcat-users.xml and add the following line:
<user username="deployer" password="deployer" roles="manager-script" />
Step 3. Edit the Jenkins job
Back in Jenkins, go to your job and select “Configure”. Next, scroll down to the bottom of the page to the “Post-build Actions”. Select the option “Deploy war/ear to a container” from the “Add post-build action” dropdown button. Fill in the new fields.
Step 4. Run the Job project and verify the end results
Schedule a build for your job in Jenkins. If you check out the log file you should see one or more lines near the end indicating that the war file has been deployed.
If you check the log files in Tomcat (catalina.out) you should also see that your application has been successfully deployed.
Lastly, if you point your browser to the URL and context path you’ve specified in the job configuration in Jenkins (e.g., http://your-server:8080/mywebapp), you should be able to open your freshly deployed application.
Credits to Jdev.it
More info can be found here
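For reference, the "Deploy war/ear to a container" step drives Tomcat's manager application under the hood; a roughly equivalent manual deployment with curl against the manager text API (reusing the deployer credentials from Step 2 and the example host and context path from Step 4) might look like this:

# Hypothetical manual deployment via Tomcat's manager text API;
# requires the manager-script role configured in Step 2.
curl -u deployer:deployer -T target/mywebapp.war \
  "http://your-server:8080/manager/text/deploy?path=/mywebapp&update=true"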
With EC2 (or any other deployment setup), first determine whether your production servers are going to be mutable or immutable.
[Mutable]
The servers run forever, and you perform ongoing updates as explained in the blog post mentioned above (elizabetht) for a Java WAR, or in many other ways for different languages/platforms.
[Immutable]
The servers are re-created (rather than upgraded) by an automation mechanism such as scripting, configuration management tools like Puppet/Chef/Ansible, vendor-specific initialization mechanisms like AWS user data, a Docker Dockerfile, or a Vagrant Vagrantfile, or many other provisioning tools.
Generally speaking, databases and queues should fall into the mutable category, while all other compute nodes are better off in the immutable category. The benefits of the immutable category are numerous, including easy HA and disaster recovery, and it also enables Blue/Green deployment and much more.

How to deploy the latest version of the site using Chef and tar_extract

I am using Jenkins to continuously build the website front-end code from a GitHub repository, package it up into a tar archive and post it to an S3 bucket. The Jenkins build creates files named like FrontEnd-122.tgz, where 122 is the build number.
I am using the following recipe to deploy the app onto the server:
deploy_version = node['my-app']['build-number']
deploy_from = "http://mybucket.com/FrontEnd-#{deploy_version}.tgz"

tar_extract "#{deploy_from}" do
  target_dir '/usr/local/site/FrontEnd'
  creates '/usr/local/site/FrontEnd/index.php'
  tar_flags [ '--strip-components 1' ]
end
This all works great; however, I have to manually update the node attribute my-app/build-number, which is fine for QA and production deployments.
What I would like to do is have a snapshot deployment VM where the latest code gets deployed, for further testing with Selenium and friends. However, to do that I need a way for the above cookbook to figure out what the latest build number is and deploy from there. Do you have any suggestions?
Tricky one, because you need a mechanism for Chef to determine the latest revision stored in S3.
Presumably you store the code in a revision control system like Subversion or Git? Would it be feasible to use the Chef deploy resource instead? Let Chef pull the website code from your trunk or master branch for testing purposes.
Another option would be to use a binary repository manager that understands the concept of "snapshots". Take a look at products like Nexus, Artifactory and Archiva. You could then use S3 as both a backup and a distribution area for approved and released copies of your site.
So, I used the dumb way to solve this issue. Besides putting the versioned archive in the S3 bucket, I also push the same archive under a name like 'FrontEnd-Latest'. I also modified the cookbooks to take a version parameter. The staging server has the version parameter set to 'Latest', and the production server has the parameter set to whatever version is considered stable.
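A hedged sketch of the Jenkins-side shell step that publishes both objects (the bucket and archive names are placeholders; BUILD_NUMBER is Jenkins' built-in environment variable):

# Upload the versioned archive, plus a copy under the fixed 'Latest' name
# that the snapshot/staging node attribute points at.
aws s3 cp FrontEnd-${BUILD_NUMBER}.tgz s3://mybucket/FrontEnd-${BUILD_NUMBER}.tgz
aws s3 cp FrontEnd-${BUILD_NUMBER}.tgz s3://mybucket/FrontEnd-Latest.tgz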
