Start docker container from another application in another docker container - docker

We have an existing Java application that exposes a REST API. When it receives an HTTP request, it starts another Java process using Runtime.getRuntime().exec().
We are in the process of migrating this application to Docker, and we would like to separate these services: the REST application in one container and the other component in another container.
Is there any way, that the REST application can start the other application in another docker container?

Yes, you can programmatically spawn a Docker container.
The Docker Remote API (now called the Docker Engine API) allows you to do that. You can either use an HTTP client library to invoke the API directly, or use one of the Java Docker client libraries that wrap it.
Here is the relevant docker documentation:
Remote API:
https://docs.docker.com/engine/reference/api/docker_remote_api/
Libraries: https://docs.docker.com/engine/reference/api/remote_api_client_libraries/
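As a rough sketch of what such a call involves (a Python illustration, not the Java client libraries the answer mentions), spawning a container takes two Engine API calls: POST /containers/create with a JSON body, then POST /containers/<id>/start. The host, port, image name, and command below are assumptions for illustration only:

```python
import json

# Assumption: the daemon exposes its (unencrypted) TCP port; in production
# you would use TLS or the Unix socket instead.
ENGINE = "http://localhost:2375"

def create_container_request(image, cmd):
    """Build the URL and JSON body for POST /containers/create."""
    body = json.dumps({"Image": image, "Cmd": cmd})
    return f"{ENGINE}/containers/create", body

url, body = create_container_request("openjdk:17", ["java", "-jar", "/app/worker.jar"])
# POST `body` to `url`; the daemon returns {"Id": "..."}.
# Then POST {ENGINE}/containers/<Id>/start to actually run the process.
```

The Java client libraries linked above wrap exactly these endpoints, so the REST application never needs Runtime.exec at all.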

Related

Adding user management for our container via dockerfile

We're using Docker for our project. We have a monitoring service (for our native application) running on Docker.
Currently there is no user management for this monitoring service. Is there any way we can add user management from the Dockerfile?
Please note that I'm not looking for Docker container user management.
In simple words, the functionality I'm looking for is:
Add a user and password in the Dockerfile.
When accessing the external IP, the same user and password must be provided to view the running monitoring service.
From the limited information about your setup, I would say the authentication should be handled by your monitoring service. If it is a kind of web app, you could use simple HTTP basic auth as a first step.
Docker doesn't know what's running in your container, and its networking setup is limited to a simple pass-through between containers or from host ports to containers. It's typical to run programs with many different network protocols in Docker (Web servers, MySQL, PostgreSQL, and Redis all have different wire protocols) and there's no way Docker on its own could inject an authorization step.
Many non-trivial Docker setups include some sort of HTTP reverse proxy container (frequently Nginx, occasionally Apache) that can both serve a JavaScript UI's built static files and route HTTP requests to a backend service. You can add an authentication control there. The mechanics of doing this are specific to your choice of proxy server.
Consider that any information you include in the Dockerfile or pass via docker build --build-arg can easily be retrieved by anyone who has your image (by looking at its docker history, or by running a debug shell on it). If the credentials are sensitive and your image isn't stored somewhere with strong protections (i.e., anyone can docker pull it from Docker Hub), you may need to inject them at run time instead, for example with a bind mount.
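As a minimal sketch of what the basic-auth check mentioned above amounts to (whether done by the proxy or by the monitoring service itself), here is the header validation in Python. The username and password are hypothetical, and as noted, they should be injected at run time rather than baked into the Dockerfile:

```python
import base64

def check_basic_auth(authorization_header, user="monitor", password="s3cret"):
    """Validate an HTTP 'Authorization: Basic <base64>' header.

    Credentials are hypothetical; load real ones from the environment
    or a mounted secrets file, never from the image itself.
    """
    expected = base64.b64encode(f"{user}:{password}".encode()).decode()
    return authorization_header == f"Basic {expected}"

# What a client (or curl -u monitor:s3cret) would send:
good_header = "Basic " + base64.b64encode(b"monitor:s3cret").decode()
```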

Run new docker container (service) from another container on some command

Is there any way to do this:
run one service (container) with the main application, a Flask server;
the server allows running other services, which are also Flask applications;
but I want to run each new service in a separate container?
For example, I have an endpoint /services/{id}/run on the server, where each id identifies a service. The Docker image is the same for all services; each service runs on a separate port.
I would like something like this: a request to the server (<host>/services/<id>/run) -> the application on the server issues some command or sends a message somewhere -> the service with that id starts in a new container.
I know that at least locally I can use docker-in-docker, or simply mount the Docker socket into a container and work with Docker from inside it. But I would like to find a way that works across multiple machines (each service can run on another machine).
For Kubernetes: I know how to create and run pods and deployments, but I can't find how to start a new container on command from another container. Can I somehow communicate with k8s from a container to run a new container?
Generally:
can I run a new container from another container without docker-in-docker and without mounting the Docker socket;
can I do it with or without Kubernetes?
Thanks in advance.
I've compiled all of the links that were in the comments under the question. I would advise taking a look at them:
Docker:
StackOverflow control Docker from another container.
The link explaining the security considerations is not working but I've managed to get it with the Webarchive: Don't expose the Docker socket (not even to a container)
Exposing dockerd API
Docker Engine Security
Kubernetes:
Access Clusters Using the Kubernetes API
Kubeflow for machine learning deployments
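On the Kubernetes side of the question: yes, a container can ask the cluster to start new containers by calling the Kubernetes API (typically through a ServiceAccount's credentials), and the scheduler then places the Pod on whichever node has capacity, which is exactly the multi-machine behaviour asked about. As an illustration, this is the kind of manifest the Flask server could POST to /api/v1/namespaces/<ns>/pods; the image name, port, and naming scheme are assumptions:

```python
def pod_manifest(service_id, image="my-flask-image", port=5000):
    """Build a Pod manifest to submit to the Kubernetes API.

    Image name, port, and the 'service-<id>' naming convention are
    hypothetical; in practice you'd also set resources, labels, etc.
    """
    return {
        "apiVersion": "v1",
        "kind": "Pod",
        "metadata": {"name": f"service-{service_id}"},
        "spec": {
            "containers": [{
                "name": "app",
                "image": image,
                "ports": [{"containerPort": port}],
            }]
        },
    }
```

In a real cluster you would submit this with the official Kubernetes client library rather than raw HTTP, and grant the calling Pod a Role that allows creating Pods.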

Docker Rest API to Perform Docker Operations Remotely

I want to connect to a Windows Server where Docker is running (Windows Server 2016) and do all the Docker operations programmatically using REST calls. Can anyone provide me the Docker REST API to connect to the Windows Server and perform operations like container creation?
Check the Docker Engine API. It gives you REST endpoints such as GET /containers/json to list all containers.
If you are looking for programmatically doing REST API calls in C#, you can look into Docker.DotNet.
You can use Git Bash for Windows to query something like
curl http://localhost:2375/containers/<your_container_name>/stats
Or, if you want to send requests to a remote Docker daemon, then follow this post
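For programmatic use, the same stats request can be built in code. A small sketch (host and port 2375, the daemon's unencrypted TCP port, are assumptions; a real remote server should be protected with TLS on 2376):

```python
def stats_url(host, container, one_shot=True):
    """Build the Engine API stats URL for a container.

    stream=false returns a single stats snapshot instead of a
    continuous stream, which is usually what a script wants.
    """
    stream = "false" if one_shot else "true"
    return f"http://{host}:2375/containers/{container}/stats?stream={stream}"
```

Fetching the resulting URL with any HTTP client returns the same JSON the curl command above does.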

Docker API, remote API, Client API

I would like to know when to use each of, and the differences between, the Docker API, Docker Remote API, Client API, and Compose API. TIA.
There is only the Docker Engine API, which allows you to manage Docker by calling it.
Docker API = the Docker Engine API.
Docker Remote API = I think this refers to configuring the Docker CLI to connect to a remote API, to manage containers on other hosts.
Client API = the Docker CLI, a command-line client for the Docker Engine API.
Compose API = this doesn't exist; Compose is just a tool that uses the Docker Engine API.
For further information, check Docker Engine API docs: https://docs.docker.com/engine/api/
Basically, all the categories you are referring to are part of the Docker Engine API.
As per the Docker Docs:
The Engine API is the API served by Docker Engine. It allows you to control every aspect of Docker from within your own applications, build tools to manage and monitor applications running on Docker, and even use it to build apps on Docker itself.
It is the API the Docker client uses to communicate with the Engine, so everything the Docker client can do can be done with the API. For example:
Running and managing containers
Managing Swarm nodes and services
Reading logs and metrics
Creating and managing Swarms
Pulling and managing images
Managing networks and volumes
These APIs are used to control Docker on the remote servers.
Docker Compose is a tool for defining and running multi-container Docker applications.
Thanks, I was trying to understand the difference between the Docker APIs while working on this Scalable Docker Deployment in the Bluemix platform.

What is the Docker Engine?

When people talk about the 'Docker Engine' do they mean both the Client and the Daemon? Or is it something else entirely?
As I see it there is a Docker Client, a Docker Daemon. The Client runs locally and connects to the Daemon which does the actual running of the containers. The Client can connect to a remote Daemon. Are these both together the Engine? thanks
The Docker Engine is the Docker Daemon running on a single host, installed with the Docker Client CLI. Here are the docs that answer this specific question.
On top of that, you can have a Swarm running that joins multiple hosts to horizontally scale and provide fault tolerance. And there are numerous other projects from Docker, like their Registry, Docker Cloud, and Universal Control Plane, that are each separate from the engine.
Docker Engine is a client-server application comprising three components:
1. Client: the Docker CLI, the command-line interface through which we interact.
2. REST API: the client communicates with the server through a REST API; commands issued by the client are sent to the server as REST calls, which is why the server can be on the local machine or a remote one.
3. Server: the local or remote host machine running a daemon process, which receives the commands and creates, manages, and destroys Docker objects such as images, containers, and volumes.
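The three components above can be made concrete by looking at what the CLI actually sends over the wire. A sketch (assuming the daemon's default Unix socket and API version v1.41, and not executed here since it needs a running daemon): `docker ps` is roughly this raw HTTP request to the Engine's REST API:

```python
import socket

def raw_engine_request(path="/v1.41/containers/json"):
    """Build the raw HTTP request the CLI issues for `docker ps`.

    Path and API version are assumptions for illustration; the daemon
    accepts a range of versions.
    """
    return (
        f"GET {path} HTTP/1.1\r\n"
        "Host: localhost\r\n"
        "Connection: close\r\n"
        "\r\n"
    )

# To actually send it (requires a running daemon and its Unix socket):
# s = socket.socket(socket.AF_UNIX)
# s.connect("/var/run/docker.sock")
# s.sendall(raw_engine_request().encode())
# print(s.recv(65536).decode())
```

This is the sense in which the CLI, the REST API, and the daemon are one application: the CLI is just a convenient client for requests like this one.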
