How to deploy a WebLogic application as a Docker container entirely using a Dockerfile?

I have a simple REST API in a WebLogic application, and I have to deploy the application as a Docker container. But I'm facing a problem in defining the Dockerfile.
Dockerfile
FROM store/oracle/weblogic:12.2.1.4
COPY target/app.war /u01/oracle
Above is my current Dockerfile. With it, I still have to deploy the application on the WebLogic server manually. We would like to automate the application deployment in the Dockerfile itself, but we couldn't find exact examples.
Please advise.

This is a complex task, so it is hard to explain the whole process here.
The high-level steps that you need to execute are the following:
Start a properly configured WebLogic domain in Docker. This task involves the creation of the admin and managed servers, a WL cluster, etc.
Build the application that you want to deploy
Configure the database properly, if you have one
Create the WL resources like connection pools, JMS, etc., manually or via a WLST script
Deploy your artifact via the WL web console, with a WLST script, or by copying the file under the autodeploy directory
Be careful: tasks that you executed manually will be lost if you drop your Docker container.
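For the last step, here is a minimal Dockerfile sketch of the autodeploy approach. The domain path below is an assumption (it varies by image version and domain setup), and autodeploy only picks up applications when the domain runs in development mode:
FROM store/oracle/weblogic:12.2.1.4
# Assumed domain location; check where your base image actually creates its domain
COPY target/app.war /u01/oracle/user_projects/domains/base_domain/autodeploy/app.war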
You can find concrete examples, use cases, automated scripts that you can use and well prepared, ready for use WebLogic Docker images here: https://github.com/zappee/docker-images
If you have a concrete question, rather than a general one like this, please start a new thread.

Take a look at the GitHub project:
https://github.com/oracle/docker-images/tree/master/OracleWebLogic/dockerfiles

Related

Run a GitLab CI pipeline in a Docker container

Absolute beginner in DevOps here. I have a GitLab repo that I would like to build, and whose tests I would like to run, in a GitLab CI pipeline.
So far, I'm only testing locally on my machine with a specific runner. There's a lot of information out there and I'm starting to get lost about what to use and how to use it.
How would I go about creating a container with the tools that I need? (VS compiler, CMake, Git, etc.)
My application contains an SDK that only works on Windows, so I'm not sure building on another platform would work at all. How do I select a Windows-based container?
How would I use that container in the .gitlab-ci.yml file so that I can build my solution and run my tests?
Any specific documentation links or suggestions are welcome and appreciated.
How would I go about creating a container with the tools that I need? (VS compiler, CMake, Git, etc.)
You can install those tools before the pipeline script runs. I usually do this in before_script.
If there are large-ish packages that need to be installed on every pipeline run, I'd recommend that you make your own image with all the required build dependencies, push it to GitLab's container registry, and then just use it as your job image.
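A minimal .gitlab-ci.yml sketch showing both options side by side (pick one; the image name, package list, and build commands are placeholders, and the apt-get line assumes a Debian-based image):
build:
  image: registry.gitlab.com/your-group/build-tools:latest   # option 2: prebuilt image with your dependencies
  before_script:
    - apt-get update && apt-get install -y cmake git         # option 1: install ad hoc on every run
  script:
    - cmake -B build && cmake --build build
    - cd build && ctest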
My application contains an SDK that only works on Windows, so I'm not sure building on another platform would work at all. How do I select a Windows-based container?
If you're using gitlab.com - Windows runners are currently in beta, but available for use.
SaaS runners on Windows are in beta and shouldn’t be used for production workloads.
During this beta period, the shared runner quota for CI/CD minutes applies for groups and projects in the same manner as Linux runners. This may change when the beta period ends, as discussed in this related issue.
If you're self-hosting - set up your own runner on Windows.
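A sketch of a job targeting the shared Windows runners (the tags below are the ones GitLab documented for the beta; self-hosted runners instead use whatever tags you registered them with):
build_windows:
  tags:
    - shared-windows
    - windows
    - windows-1809
  script:
    - echo "build your solution and run your tests here"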
How would I use that container in the .gitlab-ci.yml file so that I can build my solution and run my tests?
This really depends on:
the previous parts (whether you're using GitLab.com or self-hosted)
how your application is built
what infrastructure you have access to
What I'm trying to say is that I can't give you a good answer without quite a bit more information.

How to start docker containers using shell commands in Jenkins

I'm trying to start two containers (each with a different image) using Jenkins shell commands. I tried installing the Docker plugin in Jenkins and/or configuring Docker under Global Tool Configuration. I'm also doing all of this in a pipeline. After executing docker run... I'm getting a docker: not found error in the Jenkins console output.
I am also having a hard time finding a guide on the internet that describes exactly what I wish to accomplish. If it is of any importance, I'm trying to start a Selenium Grid hub and a Selenium Chrome node, and then use Maven (which is configured and works correctly) to run a test suite on that node.
If you have any experience with something similar to what I wish to accomplish, please share your thoughts on the best approach to this situation.
Cheers.
That's because the Docker images that you probably create within your pipeline cannot also run (become containers) within the pipeline environment, because that environment isn't designed to host applications.
You need to find a hosting provider for your Docker images (e.g. Azure or GCP). Once you set up the hosting part, add a step to your pipeline that uploads/pushes the image to that provider's Docker registry, or to the free public Docker Hub. Then, finally, add a step that tells your hosting environment to download the image from whichever registry you chose and launch it as a container (this last download-and-launch part is what docker run covers). Only at that point do you have a running app.
Good luck.
Somewhat relevant (maybe it'll help you understand how some of those things work):
The docker build command is comparable to the process of producing an installer package such as an MSI.
A Docker image is comparable to an installation package (e.g. an MSI).
The docker run command is comparable to running an installer package with the goal of installing an app. So, using the same analogy, running an MSI installs an app.
A container is comparable to an installed application. Just like an app, a Docker container can be running or stopped. This depends on the environment, which I referred to as "hosting" above.
Just like you can build an MSI package on one machine and run it on other machines, you build Docker images on one machine (the pipeline host, in your case), but you need to host them in environments that support running them.
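To make the analogy concrete, a minimal shell sketch (the image and container names are placeholders):
# "produce the installer": build an image from the Dockerfile in the current directory
docker build -t myapp:1.0 .
# "run the installer": launch the image as a container on the hosting environment
docker run -d --name myapp myapp:1.0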

Jenkins jar vs Dockerfile

I started learning DevOps recently. I have a doubt here, please help me resolve it. After testing, Jenkins packages everything, including code and dependencies, into a war/jar file. But a Dockerfile also contains the application's source code and dependencies. Now, if we are using a Docker container to deploy onto the production server, where do we use the war file that is generated by Jenkins? Can someone please clarify?
So let's take Jenkins itself as an example. You can download jenkins.war, but you still need a place to host this war file, and you can use Docker for that. Think of it as an alternative to virtual machines.
So the jar file that you have generated through a Jenkins build needs Java, or Tomcat, or whatever other external dependencies, to be up and running; Docker can provide that for you. Take a look at the following example:
Run a Simple .jar Application in a Docker Container
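A minimal Dockerfile sketch of that idea (the base image tag and jar path are assumptions; point COPY at the artifact your Jenkins build actually produces):
FROM openjdk:8-jre
# Copy the artifact produced by the Jenkins build into the image
COPY target/app.jar /opt/app/app.jar
ENTRYPOINT ["java", "-jar", "/opt/app/app.jar"]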

Docker to deploy java web application with frequent configuration changes in wildfly server

I am new to DevOps. Recently I practiced the Docker examples. I have one use case in my current project. Here it is:
This is a Java project. It contains one war project that depends on a jar project. The build tool is Maven.
We are using the JBoss WildFly server.
The database is MySQL 5.7.
We are using the TestNG framework for the unit test cases.
SonarQube for code analysis, Selenium for testing.
So on any Linux box, the infrastructure we need is Java 8, the WildFly server, and MySQL 5.7.
Consider we have 2 boxes: one is dev and another is test. Developers work on their local Windows machines.
Sometimes configuration changes are required in the JBoss folder. Suppose we changed one XML file in the WildFly configuration folder in this release; how do we communicate the same change to the dev and test boxes as well as the local machines? Another case: the dev box needs a different configuration in the XML than the test box (SSL information in standalone.xml and user properties). How do we handle this?
I would suggest running the whole required stack (WildFly and MySQL) via docker-compose. You can also extend the official WildFly image to provide your custom configuration, so you would build 2 different images: one with the tag 'dev' and a second with the tag 'test'.
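A sketch of the image-per-environment idea. The paths follow the official jboss/wildfly image layout; the config directory names, image tags, and password are placeholders:
FROM jboss/wildfly:latest
# Bake the environment-specific configuration into the image (config/test for the test image)
COPY config/dev/standalone.xml /opt/jboss/wildfly/standalone/configuration/standalone.xml
COPY target/app.war /opt/jboss/wildfly/standalone/deployments/
And a minimal docker-compose.yml to run the stack together:
version: "3"
services:
  app:
    image: myapp-wildfly:dev   # or myapp-wildfly:test, built from the other config
    ports:
      - "8080:8080"
    depends_on:
      - db
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: changeme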

Service from sources inside Docker container

Short question: is it OK (aren't there any contradictions with Docker ideology) to compile and start an application from sources inside a Docker container?
Assume that I have some hypothetical application. Let it be a Java web service built with Maven, located somewhere on GitHub. The specifics don't matter here.
But before starting this service, I need to set up several config files with the right parameters, known at deployment time. Right now I can build a fully preconfigured application package with a single Maven command, passing all the necessary configuration at the build command.
Now assume that I need to make it a Docker container and don't have time to refactor it right now. So I have a plan: let my Docker image have Maven and Git, and let an entrypoint script clone my Git repository, build, and start the application, passing all the necessary parameters via the environment.
Is this a suitable plan, or is it just wrong?
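For reference, a sketch of the plan as described (the repository URL, build property, jar name, and base image are placeholders; the base is assumed to bundle Maven and a JDK, so install Git if your image lacks it):
FROM maven:3-jdk-8
COPY entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]
And the entrypoint.sh it runs at container start:
#!/bin/sh
# Clone, build with deployment-time parameters, then start the service
git clone "$REPO_URL" /src
cd /src
mvn package -Dmy.config.param="$MY_CONFIG_PARAM"
exec java -jar target/app.jar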
