Failing to download Maven in Devcontainer - Docker

I'm currently working on a Spring Boot app which communicates with PostgreSQL.
Everything is running in a Devcontainer on Docker.
Until now, the problem I had was that I was behind a corporate proxy and Docker couldn't download images. To work around that, I switched to a different network connection.
But switching connections all the time is not something I enjoy.
Yesterday I switched to Docker Pro and changed the proxy settings for Docker.
Now that part works: when Docker needs to download images, it succeeds.
The issue now is that the SDK tool, which tries to download Maven in my Dockerfile, has trouble connecting to the internet and therefore fails to download Maven.
I guess the SDK tool needs the proxy host/port or something?
But how can I provide that?
Here is the error when I try rebuilding the Devcontainer:
[screenshot: Devcontainer rebuild error]
Here is my Dockerfile:
[screenshot: Dockerfile]
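From what I've read, the proxy can probably be passed into the image build as build arguments. A minimal sketch, assuming Maven is installed via SDKMAN and a proxy at proxy.example.com:8080 (both placeholders, not my real setup):

```dockerfile
# Placeholder proxy host/port - substitute the real corporate proxy values
ARG HTTP_PROXY=http://proxy.example.com:8080
ARG HTTPS_PROXY=http://proxy.example.com:8080

# curl and SDKMAN honour these environment variables, so the download
# of Maven should go through the proxy
ENV http_proxy=${HTTP_PROXY} \
    https_proxy=${HTTPS_PROXY} \
    HTTP_PROXY=${HTTP_PROXY} \
    HTTPS_PROXY=${HTTPS_PROXY}

RUN curl -s "https://get.sdkman.io" | bash \
    && bash -c "source /root/.sdkman/bin/sdkman-init.sh && sdk install maven"
```

The same values could also be supplied from devcontainer.json via "build": { "args": { "HTTP_PROXY": "...", "HTTPS_PROXY": "..." } } so the Dockerfile stays proxy-agnostic. Note that Maven itself reads proxy settings from ~/.m2/settings.xml rather than these variables, so dependency downloads inside the container may need a separate proxy entry there.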

Related

Docker Hub connected via CLI and Docker Desktop... @matterlabs/hardhat-zksync-solc plugin still fails to connect to Docker Hub on M1 Mac Air 2020

I can only reproduce this error on my Mac Air. I have a Mac tower from 2010 that I've OpenCore'd to macOS 12+, and it does not have this issue.
For the life of me, I cannot get past this error with the @matterlabs/hardhat-zksync-solc plugin. As you can see in the window behind my CLI, I am connected to Docker Hub through Docker Desktop. I can also log into Docker via the CLI using docker login. I've already tried logging in and out using various methods.
My last suspicion is that maybe a port is natively blocked? Where would I begin troubleshooting this?
Just answering this to mention that Docker support has been deprecated and it's recommended that users use the binary compiler source to compile contracts with zksolc.
You can find more info on how to compile contracts on zkSync here: https://v2-docs.zksync.io/dev/developer-guides/contracts/contracts.html
And specific information about the zksolc hardhat plugin here: https://v2-docs.zksync.io/api/hardhat/plugins.html#hardhat-zksync-solc

Cannot pull the project from Bitbucket (the project has IP restrictions) while using Docker with the WSL2 Ubuntu-20.04 distro

I have a Symfony project that I run on my PC with symfony serve.
The project is on Bitbucket with IP restrictions: for security reasons I can only work from home and nowhere else, and everything works just fine :).
I wanted to create a Docker image so that I can easily change machines and be able to deploy the project elsewhere.
So I created a Docker image and did the necessary configuration, and all seems good: I can open the project and work the same way as before. Docker is using the default WSL (WSL1), and I've noticed that the application isn't running as fast as usual (outside Docker, loading a page takes 3 seconds, while with Docker it takes at least 30 seconds).
I did some research and found out that I could use WSL2 with Docker, which provides better performance than the legacy Hyper-V backend, and enabled integration for the Ubuntu-20.04 distro. The problem with WSL2 is that I am no longer able to pull my project inside WSL2 (from Ubuntu-20.04) because of the IP restrictions.
It is really strange that I cannot find any configuration for this, and I have no idea what to change. If I pull the project outside the WSL2 distro it works, and it also works with the default WSL, but not with WSL2.
I removed the IP restrictions and the Docker image worked fine; I got the same speed as if I was outside Docker. The only problem is that I cannot use the IP restrictions this way!
Does anyone know how to fix this? I haven't been able to find any documentation on this issue.
I am using Windows 10 and Docker Desktop version 4.5.1 (74721).
Thanks a lot for any information.

VS Code "Attach Visual Studio Code" to remote container error

I am trying to develop in a remote container.
I run VS Code on my local Windows machine.
I have a Linux machine which runs Docker and a bunch of containers.
I have the "Remote - Containers" and "Remote - SSH" extensions installed in VS Code.
I can connect to my Linux machine in VS Code and I can see the running containers.
I can right click on a container and choose "Attach Shell". This works fine:
When I right click on a container and choose "Attach Visual Studio Code" I get an error:
UPDATE
The above error was raised because (for some reason?) Docker must also be running locally on Windows, even though we are working fully on a remote machine. I've installed and started Docker locally.
Now when I right click on a running container, I get a different error:
Of course the containers are running -- I see them.
How can I Attach Visual Studio Code to a running remote container successfully?
This may not be a real answer but it's too much for a comment.
I believe you have a local machine and docker on a remote server.
The first thing you have to do is to install Docker on your local machine and configure it so that it's looking for the Docker host on your remote server.
Then you can create a .devcontainer.json on your machine. If you have the extension installed, VS Code will offer to open this as a container environment. Since your Docker host sits on the remote, this will now happen on your server instead of your local machine.
When I did the setup I followed, amongst other things, this guide; the SSH agent in particular was required to get a remote Docker host working: https://code.visualstudio.com/docs/remote/containers-advanced#_a-basic-remote-example
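As a rough sketch of that configuration (user and host names are placeholders), the local Docker CLI can be pointed at the remote engine over SSH:

```bash
# Create a Docker context that talks to the remote engine over SSH
docker context create remote-box --docker "host=ssh://user@remote-host"
docker context use remote-box

# Or, for the current shell only, point the CLI at the remote host directly
export DOCKER_HOST=ssh://user@remote-host

docker ps   # should now list the containers running on the remote machine
```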
Here is an example .devcontainer file of mine.
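A minimal sketch of what such a file can look like when the Docker host is remote (all names and paths below are placeholders, not my actual values):

```jsonc
{
    // Placeholder values throughout
    "name": "remote-docker-example",
    "build": {
        "dockerfile": "Dockerfile",
        "context": "."
    },
    // With a remote Docker host, the bind-mount source must exist on the
    // remote machine (or use a named volume instead of a bind mount)
    "workspaceMount": "source=/home/user/project,target=/workspace,type=bind",
    "workspaceFolder": "/workspace",
    "extensions": [
        "ms-azuretools.vscode-docker"
    ]
}
```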
Now back to your initial question: I don't think you will be able to use the Remote - Containers extension on a container that wasn't started as a dev container. This is because VS Code installs a bunch of stuff in there when it's first set up, similar to the SSH extension. I may be wrong on this, so take it with a grain of salt.
It may also be worth noting that once you connect to your server via SSH and have the regular Docker extension (which is not the Remote - Containers extension) installed on the remote, you will see your Docker images listed there. But that does not mean you will be able to connect from local to a remote container like that; for that you need to configure a remote Docker host.
I have also faced a similar issue; after doing some research I found the problem was with my installation.
In my case it happened when I installed VS Code through snap on Ubuntu.
Maybe try uninstalling VS Code and reinstalling it.
It should work if Docker is installed properly.

Unable to download the Jenkins plugins running on Google Cloud Platform

I'm running Jenkins as a Docker container on a virtual machine on Google Cloud Platform. On the very first setup screen, I can see that a lot of plugins failed to install on my Jenkins server.
Please let me know how to resolve this issue. Is it something to do with default cloud security settings that restrict downloading plugins?
Refer to the following link for a screenshot:
https://storage.googleapis.com/mydockerissues/Jenkins%20Plugins%20Issue.PNG
Cheers
Something similar happened to me when running Jenkins on Docker on my local machine. To get everything to install I had to keep retrying. It took several retries but eventually I got everything installed.
I'm not sure why this is the case. Maybe it fails downloads whose dependencies aren't installed yet?

Can I containerise a console app with VS2019 into ACR? It seems only .NET Core web apps will work

I am using VS2019 with Docker support (and Windows Docker running). When creating a console app there is no checkbox for Docker support like there is for a .NET Core web app. This means I cannot publish a container to ACR from VS2019.
I can create a console app, then add Docker support, and I can build the image, but embarrassingly I cannot locate the image that's built or figure out how to get it into ACR. Running docker ps -a shows nothing in the container list.
So: is there support for pushing containers to ACR from VS2019 console apps?
If not, then exactly how do I build a Docker image of a console app and get it to ACR? Am I left with the CLI only?
thanks
Paul
I managed to solve this.
The issue was that the underlying version of Windows 10 I was on would not let me create a Docker image using a more modern, supported version of Windows Nano Server. Once I updated Windows and double-checked the base image the build was using, it all worked. Take the resulting Docker image, use the Azure CLI to push it to ACR, and then I can run the container from ACI...
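The push itself might look something like this (registry, image and resource group names are placeholders):

```bash
# Log in to the Azure Container Registry (name is a placeholder)
az acr login --name myregistry

# Tag the locally built image with the registry's login server and push it
docker tag myconsoleapp:latest myregistry.azurecr.io/myconsoleapp:latest
docker push myregistry.azurecr.io/myconsoleapp:latest

# The pushed image can then be run on Azure Container Instances
# (registry credentials or a managed identity may also be required)
az container create --resource-group my-rg --name myconsoleapp \
    --image myregistry.azurecr.io/myconsoleapp:latest
```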
Bit tortuous compared to creating a web app but it worked...
