How to deploy an ML model in Docker

I am getting an error regarding deployment in Docker.
I have created a model in Flask and I need to deploy it in Docker.
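A minimal Dockerfile sketch for containerizing a Flask app, assuming (since the question doesn't show the code) that the entry point is app.py listening on port 5000 and the dependencies are listed in requirements.txt:

FROM python:3.9-slim
WORKDIR /app
# Install dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy the rest of the application, including the saved model file
COPY . .
EXPOSE 5000
CMD ["python", "app.py"]

Build and run it with docker build -t flask-model . and docker run -p 5000:5000 flask-model.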

Related

CI/CD with Docker Compose on Google Cloud

I'm building a pipeline in Buddy where the application is built on command. New containers are built and pushed to DockerHub when it's time to deploy. I'm using Google's Container-Optimized OS to run Docker Compose. But I can't find a way to automatically refresh and pull the latest container from DockerHub within the Container-Optimized OS on a running instance.
Any idea how this can be achieved?

Can I use the compute power of a different machine for Docker?

I use Docker locally for development. I run a few containers for Redis, Postgres, the frontend compilation and the backend compilation. The frontend and backend map files from my local machine into the Docker containers, where a process runs that auto-compiles. Then I can access the backend server and frontend webserver from the Docker containers hosting them.
My backend can be very resource-intensive, as I'm developing a task that processes a large amount of time-series data. It can take about 5-10 mins on my machine. I'm using a 15-inch MacBook Pro as my local machine, and running Docker and my development setup is really pushing it to its limits. I'm considering running Docker on another Linux PC I have and connecting to it from my MacBook Pro.
I use CircleCI quite a bit, and they have a setup with Docker where the CI containers you run don't actually run Docker themselves but are networked out to a separate dedicated machine. The only issue is that mapping volumes doesn't work too well.
How can I set this up in docker so that I can run docker commands locally that run on a separate machine?
Any ideas how I can map the directories to the other machine?
You can use SSH to run commands on another machine:
ssh user@server docker run hello-world
I would recommend against mapping volumes, as that doesn't work well. Instead, I'd simply copy the data you need to the server.
scp -r directory-to-copy/* user@server:/destination-to-copy-into
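If you want your local docker commands themselves to execute against the engine on the other machine, a sketch assuming Docker 18.09+ on both ends and SSH access to the Linux box (user and server are placeholders):

# Point the local CLI at the remote daemon over SSH for a single command
DOCKER_HOST=ssh://user@server docker ps

# Or create a named context and switch to it, so every subsequent command runs remotely
docker context create linux-box --docker "host=ssh://user@server"
docker context use linux-box
docker run hello-world   # now runs on the remote machine

Note that bind mounts in this setup refer to paths on the remote machine, which is why copying the data over with scp, as above, is the simpler route.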

How can I deploy WAR files on Tomcat using Oracle and ActiveMQ in Docker?

I want to deploy all my application WAR files on a Tomcat container running in Docker, which depends on an Oracle 12c container and an ActiveMQ container. What would be a good approach: using docker-compose, or running 3 separate images?
Anything that can be done by running individual Docker containers can be done using docker-compose. In fact, docker-compose is built for defining and running multi-container Docker applications.
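A rough docker-compose.yml sketch of that setup (service names, image tags, and the mounted WAR directory are placeholders; Oracle 12c images in particular usually have to be built locally or pulled from Oracle's own registry):

version: "3"
services:
  oracle:
    # Placeholder image: Oracle 12c is typically built locally or pulled from Oracle's container registry
    image: my-registry/oracle-12c:latest
    ports:
      - "1521:1521"
  activemq:
    image: rmohr/activemq:latest
    ports:
      - "61616:61616"   # broker
      - "8161:8161"     # web console
  tomcat:
    image: tomcat:9
    depends_on:
      - oracle
      - activemq
    ports:
      - "8080:8080"
    volumes:
      # Mount your WAR files into Tomcat's webapps directory so they are auto-deployed
      - ./wars:/usr/local/tomcat/webapps

A single docker-compose up then starts all three containers, with Tomcat started after the other two (note that depends_on controls start order, not readiness).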

GitLab CI/CD deploy Docker to AWS EC2

We are developing a Spring Boot application which is currently deployed to AWS manually. For that, we first build a Docker image from a Dockerfile, then connect to the AWS EC2 instance from a laptop, pull the image, and use docker run to start it. But we want to automate the process using GitLab CI/CD.
We created a .gitlab-ci.yml; the build stage builds the Spring Boot application and generates a jar file. The package stage then builds a Docker image using the Dockerfile from the source code and pushes the image to the registry.
Now I don't know how to finish the deploy stage. Most tutorials only explain deploying to Google Cloud. I use the steps below to deploy the Docker image:
ssh -i "spring-boot.pem" ubuntu#ec2-IP_address.compute-2.amazonaws.com
sudo docker pull username/spring-boot:v1
sudo docker run -d -p 80:8080 username/spring-boot:v1
Can anybody help me add the above steps to the deploy stage? Do I need to add the pem file to the source to connect to the EC2 instance?
Or is there an easier way to deploy Docker to EC2 from GitLab CI/CD?
First thing: if there is SSH, that means you must provide a key or password, unless you allow access to everyone.
Do I need to add the pem file to the source to connect to the EC2 instance?
Yes, you should provide the key for SSH.
Or is there an easy way to deploy Docker to EC2 from GitLab CI/CD?
Yes, there is an easier way, but for that you need to use ECS, which is designed specifically for Docker containers, and you can manage your deployment through its API instead of SSHing to the EC2 server.
ECS is designed for running Docker containers. A big advantage of ECS over plain EC2 is that you do not need to worry about container management, scalability, and availability; ECS takes care of that. It also provides ECR, which is like a Docker registry, but private and in-network.
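A minimal sketch of a deploy stage that just automates the SSH steps from the question, assuming the contents of spring-boot.pem are stored in a GitLab CI/CD variable named SSH_PRIVATE_KEY and EC2_HOST is a placeholder variable for the instance's public DNS name:

deploy:
  stage: deploy
  image: alpine:latest
  before_script:
    - apk add --no-cache openssh-client
    - eval $(ssh-agent -s)
    # The key lives in a protected CI/CD variable, not in the repository
    - echo "$SSH_PRIVATE_KEY" | tr -d '\r' | ssh-add -
  script:
    - ssh -o StrictHostKeyChecking=no ubuntu@$EC2_HOST "sudo docker pull username/spring-boot:v1 && sudo docker run -d -p 80:8080 username/spring-boot:v1"
  only:
    - master

So the pem file does not need to be committed to the source; storing its contents as a protected CI/CD variable is the usual approach.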

Deploying multiple docker containers to AWS ECS

I have created Docker containers using docker-compose. In my local environment, I am able to bring up my application without any issues.
Now I want to deploy all my Docker containers to AWS EC2 (ECS). After going over the ECS documentation, I found out that we can make use of the same docker-compose to deploy to ECS using the ECS CLI. But the ECS CLI is not available for Windows instances as of now. So now I am not sure how to use my docker-compose to build all my images with a single command and deploy them to ECS from a Windows instance.
It seems like I have to deploy my Docker containers to ECS one by one, using the steps below (a sketch of the corresponding commands follows the list):
From the ECS Control Panel, create a Docker Image Repository.
Connect your local Docker client with your Docker credentials in ECS:
Copy and paste the Docker login command from the previous step. This will log you in for 24 hours.
Tag your image locally, ready to push to your ECS repository, using the repo URI from the first step.
Push the image to your ECS repository.
Create tasks with the web UI, or manually as a JSON file.
Create a cluster using the web UI.
Run your task, specifying the EC2 cluster to run on.
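A rough sketch of those push steps as shell commands, assuming AWS CLI v2 and treating the account ID, region, and repository name as placeholders:

# Log in to the image repository (this is the command the console gives you to copy and paste)
aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com

# Tag the locally built image with the repository URI, then push it
docker tag my-app:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest
docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest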
Is there any other way of running the Docker containers in ECS?
docker-compose is the wrong tool here when you're using ECS.
You can configure multiple containers within a task definition, as seen here in the CloudFormation docs:
ContainerDefinitions is a property of the AWS::ECS::TaskDefinition resource that describes the configuration of an Amazon EC2 Container Service (Amazon ECS) container
Type: "AWS::ECS::TaskDefinition"
Properties:
Volumes:
- Volume Definition
Family: String
NetworkMode: String
PlacementConstraints:
- TaskDefinitionPlacementConstraint
TaskRoleArn: String
ContainerDefinitions:
- Container Definition
Just list multiple containers there and all will be launched together on the same machine.
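A rough sketch of what that looks like with two containers in one task definition (names, images, memory values, and ports are placeholders):

Type: "AWS::ECS::TaskDefinition"
Properties:
  Family: my-app
  ContainerDefinitions:
    # First container: the web application
    - Name: web
      Image: 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest
      Memory: 512
      PortMappings:
        - ContainerPort: 8080
          HostPort: 80
    # Second container: a cache running alongside it on the same instance
    - Name: cache
      Image: redis:alpine
      Memory: 256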
I was in the same situation as you. One way to resolve this was using Terraform to deploy our containers as task definitions on AWS ECS.
So we use the docker-compose.yml to run locally, and the Terraform configuration is a kind of mirror of our docker-compose on AWS.
Another option is Kubernetes: you can translate your docker-compose file into Kubernetes resources.
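One way to do that translation (the answer doesn't name a tool, so kompose here is a suggestion) is to generate Kubernetes manifests directly from the compose file:

# Convert each compose service into Kubernetes Deployment/Service manifests
kompose convert -f docker-compose.yml
# Then apply the generated YAML files to the cluster
kubectl apply -f .

Services without published ports only get a Deployment, so the generated manifests usually still need a review before applying them.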
