Geth running in Docker runs out of memory (OOM) - docker

I use docker to run geth on Linux:
Client:
Version: 18.06.0-ce
API version: 1.38
Go version: go1.10.3
Git commit: 0ffa825
Built: Wed Jul 18 19:04:39 2018
OS/Arch: linux/amd64
Experimental: false
Server:
Engine:
Version: 18.06.0-ce
API version: 1.38 (minimum version 1.12)
Go version: go1.10.3
Git commit: 0ffa825
Built: Wed Jul 18 19:13:39 2018
OS/Arch: linux/amd64
Experimental: false
Linux:
Linux version 4.4.0-1072-aws (buildd@lcy01-amd64-026) (gcc version 5.4.0 20160609 (Ubuntu 5.4.0-6ubuntu1~16.04.10) ) #82-Ubuntu SMP Fri Nov 2 15:00:21 UTC 2018
geth:
Version: 1.8.20-unstable
Git Commit: f850123ec7500af732b73c2501f1ef36d80ee01d
Architecture: amd64
Protocol Versions: [63 62]
Network Id: 1
Go Version: go1.11.2
Operating System: linux
GOPATH=
GOROOT=/usr/local/go
I use docker-compose to start it up. After it has been running for a while, it uses almost 50% of the memory, and usage keeps increasing until the process goes down. Why does this happen? How can I fix this problem?
Thanks
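One way to keep a hungry geth process from taking the whole host down is to put a hard memory cap on its container (mem_limit in a compose file, or docker update --memory). The sketch below does the same thing through the Docker Go SDK; it is only an illustration, and the container name "geth" and the 4 GiB limit are assumed values, not taken from the question. Note that a cap does not stop geth from growing, it only confines the eventual OOM kill to the container; lowering geth's --cache setting is the usual way to shrink its footprint.

package main

import (
    "context"
    "log"

    "github.com/docker/docker/api/types/container"
    "github.com/docker/docker/client"
)

func main() {
    // Connection settings come from the environment (DOCKER_HOST etc.);
    // assumes a reasonably recent Docker Go SDK.
    cli, err := client.NewClientWithOpts(client.FromEnv, client.WithAPIVersionNegotiation())
    if err != nil {
        log.Fatal(err)
    }

    // "geth" is a placeholder for the container name from the compose project.
    // Memory and MemorySwap are in bytes; 4 GiB is an arbitrary example limit,
    // the programmatic equivalent of mem_limit in a compose file.
    limit := int64(4) << 30
    _, err = cli.ContainerUpdate(context.Background(), "geth", container.UpdateConfig{
        Resources: container.Resources{
            Memory:     limit,
            MemorySwap: limit, // equal to Memory so the container cannot spill into swap
        },
    })
    if err != nil {
        log.Fatal(err)
    }
}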

Related

Is there a bug in the Docker ContainerWait API when creating a large number of containers?

Is there a bug in the Docker ContainerWait API? The logs show that my program stops at the ContainerWait call, but I found out via docker ps -a that the container has already exited with exitcode=0. With 2000+ containers created, there is roughly a 1 in 100 chance that this happens. When I create a smaller number of containers, it does not happen. Has anyone experienced the same problem?
Can I implement a wait-for-container API myself by polling the container state (see the sketch after the version details below)? Are there any problems with this approach?
docker version:
Client:
Version: 1.13.1
API version: 1.26
Package version: docker-1.13.1-205.git7d71120.el7.centos.x86_64
Go version: go1.10.3
Git commit: 7d71120/1.13.1
Built: Wed Apr 28 13:37:12 2021
OS/Arch: linux/amd64
Server:
Version: 1.13.1
API version: 1.26 (minimum version 1.12)
Package version: docker-1.13.1-205.git7d71120.el7.centos.x86_64
Go version: go1.10.3
Git commit: 7d71120/1.13.1
Built: Wed Apr 28 13:37:12 2021
OS/Arch: linux/amd64
Experimental: false
go sdk version: v1.13.1
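As for implementing the wait by polling: yes, that works. Below is a minimal sketch, assuming a recent Docker Go SDK (github.com/docker/docker/client); the waitByPolling helper and the 2-second interval are made up for illustration. The trade-offs are a small status lag, extra load on the daemon when many containers are polled at once, and the need to handle the case where the container is removed between polls.

package main

import (
    "context"
    "fmt"
    "log"
    "time"

    "github.com/docker/docker/client"
)

// waitByPolling inspects the container until it is no longer running and
// returns its exit code. It is a stand-in for ContainerWait.
func waitByPolling(ctx context.Context, cli *client.Client, id string) (int, error) {
    ticker := time.NewTicker(2 * time.Second)
    defer ticker.Stop()
    for {
        insp, err := cli.ContainerInspect(ctx, id)
        if err != nil {
            return 0, err // includes "no such container" if it was removed meanwhile
        }
        if insp.State != nil && !insp.State.Running {
            return insp.State.ExitCode, nil
        }
        select {
        case <-ctx.Done():
            return 0, ctx.Err()
        case <-ticker.C:
        }
    }
}

func main() {
    cli, err := client.NewClientWithOpts(client.FromEnv, client.WithAPIVersionNegotiation())
    if err != nil {
        log.Fatal(err)
    }
    // "your-container-id" is a placeholder.
    code, err := waitByPolling(context.Background(), cli, "your-container-id")
    fmt.Println(code, err)
}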

Un-responsive docker containers

I see that Docker containers on my host show as Running/Up; however, when I try to exec into them, I see:
rpc error: code = 2 desc = containerd: container not found
I don't see any related processes in the ps -aef output.
Looking through the dockerd logs I see:
level=error msg="containerd: get exit status" error="containerd:
process has not exited" id=e4e5d58359 pid=bba1944c4 systemPid=5132
docker version:
Client:
Version: 1.13.1
API version: 1.26
Go version: go1.7.5
Git commit: 092cba3
Built: Wed Feb 8 06:50:14 2017
OS/Arch: linux/amd64
Server:
Version: 1.13.1
API version: 1.26 (minimum version 1.12)
Go version: go1.7.5
Git commit: 092cba3
Built: Wed Feb 8 06:50:14 2017
OS/Arch: linux/amd64
Experimental: false
What might be causing this behavior? Any pointers?
This issue has been fixed since v17.12.
Version 18.03 is the latest supported release, so you should upgrade your Docker to the latest edition.
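Before (or while) upgrading, one way to confirm that the daemon's view has drifted from reality is to compare the PID it reports for the container with what actually exists on the host. A rough diagnostic sketch with the Docker Go SDK, reusing the container ID from the log line above:

package main

import (
    "context"
    "fmt"
    "log"
    "os"

    "github.com/docker/docker/client"
)

func main() {
    cli, err := client.NewClientWithOpts(client.FromEnv, client.WithAPIVersionNegotiation())
    if err != nil {
        log.Fatal(err)
    }
    // "e4e5d58359" is the container ID that appears in the dockerd log above.
    insp, err := cli.ContainerInspect(context.Background(), "e4e5d58359")
    if err != nil {
        log.Fatal(err)
    }
    pid := insp.State.Pid
    // If the status says "running" but the PID is gone, dockerd and containerd
    // have lost track of each other, which matches the symptom described.
    if _, statErr := os.Stat(fmt.Sprintf("/proc/%d", pid)); os.IsNotExist(statErr) {
        fmt.Printf("container status %q, but PID %d no longer exists\n", insp.State.Status, pid)
    } else {
        fmt.Printf("container status %q, PID %d is alive\n", insp.State.Status, pid)
    }
}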

Are docker version and docker engine version the same? How to check them separately?

docker version returns:
Client:
Version: 17.12.1-ce
API version: 1.35
Go version: go1.9.4
Git commit: 7390fc6
Built: Tue Feb 27 22:17:40 2018
OS/Arch: linux/amd64
Server:
Engine:
Version: 17.12.1-ce
API version: 1.35 (minimum version 1.12)
Go version: go1.9.4
Git commit: 7390fc6
Built: Tue Feb 27 22:16:13 2018
OS/Arch: linux/amd64
Experimental: false
I want to install JupyterHub, which requires Docker Engine 1.12.0, but I suspect that 17.12.1 is not the engine version. How can I get the engine version?
Docker Engine versioning changed between February and March 2017.
The last version in the old format is 1.13.1 (2017-02-08). The first stable version of the community edition in the new format is 17.03.0-ce (2017-03-01).
So, 17.12.1 is newer than 1.12.0 and they both refer to the Docker Engine.
You can check the old versions here: https://docs.docker.com/release-notes/docker-engine/ and the new versions here: https://docs.docker.com/release-notes/docker-ce/.
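From the CLI, docker version --format '{{.Server.Version}}' prints just the engine version. If you ever need it programmatically, the Docker Go SDK exposes the same "Server" block through ServerVersion; a small sketch, assuming a recent SDK:

package main

import (
    "context"
    "fmt"
    "log"

    "github.com/docker/docker/client"
)

func main() {
    cli, err := client.NewClientWithOpts(client.FromEnv, client.WithAPIVersionNegotiation())
    if err != nil {
        log.Fatal(err)
    }
    // ServerVersion calls the /version endpoint, i.e. the "Server" section of docker version.
    v, err := cli.ServerVersion(context.Background())
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println("engine version:", v.Version) // e.g. 17.12.1-ce
    fmt.Println("API version:", v.APIVersion) // e.g. 1.35
}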

Docker image pull is not successful, it gets stuck

I used docker-compose up to pull the images. I have tried it more than 10 times, but it still does not succeed. It always freezes during the Downloading step on some particular images (different ones each time).
It happens like in the attached image every time when I run docker-compose up. My docker version:
tommiekub@dell3542:~/dev/freelance/sensebook$ docker version
Client:
Version: 18.06.1-ce
API version: 1.38
Go version: go1.11
Git commit: e68fc7a215
Built: Fri Sep 7 11:26:59 2018
OS/Arch: linux/amd64
Experimental: false
Server:
Engine:
Version: 18.06.1-ce
API version: 1.38 (minimum version 1.12)
Go version: go1.11
Git commit: e68fc7a215
Built: Fri Sep 7 11:26:11 2018
OS/Arch: linux/amd64
Experimental: false
How to solve this?
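One pragmatic way to work around stalled downloads is to pull the images individually with a per-attempt timeout and retry, then run docker-compose up once everything is local. A rough sketch with the Docker Go SDK; the image names, attempt count and timeout below are placeholders, not values from the question:

package main

import (
    "context"
    "fmt"
    "io"
    "log"
    "time"

    "github.com/docker/docker/api/types"
    "github.com/docker/docker/client"
)

// pullWithRetry pulls one image with a per-attempt timeout, retrying when a
// download stalls instead of hanging forever.
func pullWithRetry(cli *client.Client, ref string, attempts int, timeout time.Duration) error {
    var lastErr error
    for i := 0; i < attempts; i++ {
        ctx, cancel := context.WithTimeout(context.Background(), timeout)
        rc, err := cli.ImagePull(ctx, ref, types.ImagePullOptions{})
        if err == nil {
            // The pull only makes progress while the progress stream is consumed.
            _, err = io.Copy(io.Discard, rc)
            rc.Close()
        }
        cancel()
        if err == nil {
            return nil
        }
        lastErr = err
        fmt.Printf("pull of %s failed (attempt %d/%d): %v\n", ref, i+1, attempts, err)
    }
    return lastErr
}

func main() {
    cli, err := client.NewClientWithOpts(client.FromEnv, client.WithAPIVersionNegotiation())
    if err != nil {
        log.Fatal(err)
    }
    // Placeholder image names standing in for the services in the compose file.
    for _, ref := range []string{"postgres:11", "redis:5"} {
        if err := pullWithRetry(cli, ref, 3, 10*time.Minute); err != nil {
            log.Fatal(err)
        }
    }
}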

Docker - Inbound network access to host stops when new network is created

Anytime I create a new Docker network on the host, I lose inbound network access. If I have an active SSH session, it boots me off. The only way to restore access is to delete the network.
[admin@server1 ~]$ docker version
Client:
Version: 17.06.0-ce
API version: 1.30
Go version: go1.8.3
Git commit: 02c1d87
Built: Fri Jun 23 21:20:36 2017
OS/Arch: linux/amd64
Server:
Version: 17.06.0-ce
API version: 1.30 (minimum version 1.12)
Go version: go1.8.3
Git commit: 02c1d87
Built: Fri Jun 23 21:21:56 2017
OS/Arch: linux/amd64
Experimental: false
[admin@server1 ~]$ uname -a
Linux server1 3.10.0-514.26.2.el7.x86_64 #1 SMP Tue Jul 4 15:04:05 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
[admin@server1 ~]$ cat /etc/centos-release
CentOS Linux release 7.3.1611 (Core)
I upgraded to the latest version as Tarun suggested, and it fixed the issue.
[root@server1 ~]# docker --version
Docker version 17.06.2-ce, build cec0b72
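For completeness: when this symptom shows up on versions where the daemon bug is not the cause, it is often because the subnet Docker picked for the new bridge overlaps a network the host needs to stay reachable from (such as the LAN the SSH client sits on). A purely illustrative sketch that pins a non-overlapping subnet explicitly via the Go SDK; the network name and the 172.28.0.0/16 range are made-up values, and the option struct name has shifted slightly across SDK releases:

package main

import (
    "context"
    "log"

    "github.com/docker/docker/api/types"
    "github.com/docker/docker/api/types/network"
    "github.com/docker/docker/client"
)

func main() {
    cli, err := client.NewClientWithOpts(client.FromEnv, client.WithAPIVersionNegotiation())
    if err != nil {
        log.Fatal(err)
    }
    // "app-net" and 172.28.0.0/16 are placeholders; choose a range that does not
    // overlap any subnet the host itself must remain reachable from.
    _, err = cli.NetworkCreate(context.Background(), "app-net", types.NetworkCreate{
        Driver: "bridge",
        IPAM: &network.IPAM{
            Config: []network.IPAMConfig{{Subnet: "172.28.0.0/16"}},
        },
    })
    if err != nil {
        log.Fatal(err)
    }
}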

Resources