How to Restart the Bluemix Secure Gateway Docker Client

I created the Bluemix Secured Gateway Docker Client in my local environment and everything was working fine. But after I rebooted my workstation, the Docker Client does not work anymore. I ran the Docker start command, but nothing happened. What should I do to get it working again?
Thanks.

Please run the "docker run ..." command exactly as it is shown in the Secure Gateway UI. The "docker start/stop/restart" commands are not supported.
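For reference, the UI-generated command generally has the shape below; the image name is IBM's published client image, but the gateway ID and security token here are placeholders, so copy the exact command from your own Secure Gateway dashboard rather than this sketch:

```shell
# Placeholders: replace <gateway-id> and <security-token> with the exact
# values shown in your Secure Gateway UI.
docker run -it ibmcom/secure-gateway-client <gateway-id> -t <security-token>
```

A fresh docker run simply starts a new client container that reconnects to the same gateway, which is why restarting the stopped container is neither necessary nor supported.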

Related

GitLab runner gets stuck while pulling a Docker image

I was trying to run my GitLab CI pipeline on my self-hosted GitLab server and picked Docker as the gitlab-runner executor, but the pipeline gets stuck and doesn't work.
What should I do to fix this?
It seems to be the same issue: the machine the Docker daemon runs on is sitting behind a proxy server, which is why it gets stuck when trying to pull the image.
If you are able to log in to the machine, check its internet access.
Check whether you are using some kind of proxy.
Your ID may have SSO to the proxy, which is why things work under your ID; if the gitlab-runner service runs under a different account, that account may not have internet access.
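If a proxy does turn out to be the issue, the usual fix is to give the Docker daemon itself the proxy settings, since the daemon performs the image pulls regardless of which account triggered the job. A minimal sketch for a systemd-based host; the proxy address is an assumption you must replace with your own:

```shell
# Assumed proxy address: substitute your real proxy host and port.
sudo mkdir -p /etc/systemd/system/docker.service.d
sudo tee /etc/systemd/system/docker.service.d/http-proxy.conf <<'EOF'
[Service]
Environment="HTTP_PROXY=http://proxy.example.com:3128"
Environment="HTTPS_PROXY=http://proxy.example.com:3128"
Environment="NO_PROXY=localhost,127.0.0.1"
EOF
sudo systemctl daemon-reload
sudo systemctl restart docker
```

Recent Docker versions report the configured proxy in docker info, so you can verify the setting took effect before re-running the pipeline.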

Azure Web App for Containers - additional parameters for the docker run command

I have a web application developed using PHP and Laravel, and I am trying to host it in the Azure Web App for Containers service. I have integrated Stackify logging functionality for the application and the server.
I need to pass additional parameters to the docker run command in Azure Web App for Containers, specifically the --pid and -v options:
docker run -it --pid=host -v /usr/local/stackify:/usr/local/stackify
I did not find a way to configure this. Please suggest a solution. Is there any way to configure the docker run command in the Azure Web App for Containers service?
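For what it's worth, App Service does not let you append arbitrary flags such as --pid=host to its docker run invocation. For the -v part specifically, a possible approximation is App Service's "bring your own storage" path mount via the Azure CLI; all the resource names below are placeholders for illustration:

```shell
# Placeholder resource names; --mount-path maps the share into the container.
az webapp config storage-account add \
  --resource-group myResourceGroup \
  --name myLaravelApp \
  --custom-id stackify-mount \
  --storage-type AzureFiles \
  --account-name mystorageaccount \
  --share-name stackify \
  --access-key "<storage-key>" \
  --mount-path /usr/local/stackify
```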

VSCode combine remote ssh and remote containers

On my office desktop machine, I'm running a Docker container that accesses the GPU. Since I'm working from home, I'm connected to my office desktop through SSH in VS Code via the Remote-SSH plugin, which works really well. However, I would like to further connect via Remote-Containers to that running container in order to debug the code running inside it. I have failed to get this done so far.
Does anyone know whether this is possible at all and, if so, how to get it done?
Install and activate an SSH server in the container.
Expose the SSH port via Docker.
Create a user with a home directory and password in the container.
(Install the Remote-SSH extension for VS Code and) set up the SSH connection within the remote extension in VS Code, adding a config entry:
Host <host>-docker
Hostname your.host.name
User userIdContainer
Port exposedSshPortInContainer
Connect in VS Code.
Note: answer provided by the OP in the question section.
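The container-side part of the steps above can be sketched as follows, assuming a Debian/Ubuntu-based image and a hypothetical user name:

```shell
# Run inside the container (Debian/Ubuntu assumed; adjust the package
# manager for other base images).
apt-get update && apt-get install -y openssh-server
useradd -m -s /bin/bash devuser            # hypothetical user name
echo 'devuser:change-this-password' | chpasswd
service ssh start

# On the office desktop, the container must have been started with its SSH
# port published, e.g. container port 22 mapped to host port 2222:
#   docker run --gpus all -p 2222:22 <your-image>
# The config entry above would then use Port 2222 and User devuser.
```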

Enabling Kubernetes on Docker Desktop breaks access to external service

I'm using Docker Desktop for Mac.
I have built a docker image for a Node.js app that connects to an external MongoDB database via URI (the db is running on an AWS instance that I'm connected to over vpn). This works fine - I run the container and the app can connect to the database. Happy days.
Then...
I enable Kubernetes on docker desktop. I apply a deployment.yml to run the container but this deployment fails when trying to connect to the db. From my app's logs (I'm using mongoose):
MongooseServerSelectionError: connect EHOSTUNREACH [MY DB IP] +30005ms
Interestingly...
I can now no longer connect to the db by running my docker container either. I get the same error.
I have to disable kubernetes, restart docker desktop (twice), prune my previous container and network, and re-run my container. Then it will work again.
As soon as I enable kubernetes again, the db becomes unreachable again.
Any ideas why this is and/or how to fix it?
So the issue for us turned out to be an IP range clash. Exactly the same as described in this SO question:
Change Kubernetes docker-for-desktop cluster network ip
Unfortunately, like that user, we haven't been able to find a solution.
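One quick way to confirm such a clash is to compare your VPN/database subnet with the cluster's service CIDR (Docker Desktop's Kubernetes typically uses the kubeadm default of 10.96.0.0/12). A small shell sketch of the overlap check; the subnets in the example are assumptions, so substitute your own:

```shell
# Convert a dotted-quad IPv4 address to an integer.
ip2int() {
  old_ifs=$IFS; IFS=.
  set -- $1
  IFS=$old_ifs
  echo $(( ($1 << 24) + ($2 << 16) + ($3 << 8) + $4 ))
}

# Succeed (exit 0) if the two CIDR blocks overlap.
cidr_overlap() {
  i1=$(ip2int "${1%/*}"); len1=${1#*/}
  i2=$(ip2int "${2%/*}"); len2=${2#*/}
  # Two blocks overlap iff they agree under the shorter (larger-block) prefix.
  if [ "$len1" -lt "$len2" ]; then len=$len1; else len=$len2; fi
  mask=$(( (0xFFFFFFFF << (32 - len)) & 0xFFFFFFFF ))
  [ $(( i1 & mask )) -eq $(( i2 & mask )) ]
}

# Example: an assumed VPN subnet vs. the assumed default service CIDR.
cidr_overlap 10.100.0.0/16 10.96.0.0/12 && echo clash || echo "no clash"
```

Here the example prints clash, because 10.100.0.0/16 falls inside 10.96.0.0/12 (which covers 10.96.0.0 through 10.111.255.255); that is exactly the situation that makes the database IP unreachable once Kubernetes claims the range.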

How to SSH in to different containers in Multi Container Azure App Service

I want to SSH into my containers created in an Azure App Service. These are Linux-based containers, and I used Docker Compose to deploy them to Azure App Service.
I have followed the article to enable SSH. For one of the containers (Container A) I am able to SSH (exposed ports 2222 and 80 for this). But I would like to SSH into the other container (Container B) too. I have exposed another port, 2223, for Container B and followed the same steps in that document. When I try to access it using the command ssh root@172.xx.x.x -p 2223, I get the error Connection refused. But the command ssh root@172.xx.x.x -p 2222 works for Container A, and I can see the dotnet process running for the API in Container A when I run the top command.
This is still not supported in 2020. For further info, you can also check the GitHub repo for App Service.
https://github.com/Azure/app-service-linux-docs
For SSH in a multi-container App Service, you can't select a specific container to SSH into; it's not possible as of today. The front-facing container is always picked for SSH, which is usually the one on port 80; if no container exposes port 80, the first container in the YAML file is used.
https://github.com/Azure/app-service-linux-docs/blob/master/how_multicontainer_webapp_determine_web_container.md
https://feedback.azure.com/forums/169385-web-apps/suggestions/34743265-support-ssh-to-specific-container-in-multi-contain
This is the thread from the product owner in late 2018. It seems they still haven't approved that functionality.
After further research, I found out that we cannot SSH into a specific container in a multi-container app. Currently, Azure supports SSH to the public-facing container only. Based on this link, it is planned by the Azure App Service team!
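Given that SSH always lands in the web-facing container, the one lever you do control is which container App Service treats as web-facing: per the how_multicontainer_webapp_determine_web_container doc linked above, it is the container exposing port 80, otherwise the first service in the compose file. A hypothetical sketch; the service and image names are placeholders:

```yaml
# Hypothetical docker-compose.yml for illustration.
version: "3"
services:
  api:                                   # exposes port 80, so App Service
    image: myregistry.azurecr.io/api     # picks it as the web (SSH) container
    ports:
      - "80:80"
  worker:                                # not reachable via App Service SSH
    image: myregistry.azurecr.io/worker
```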
