Docker: error using docker-compose up on the official Getting Started tutorial

I'm new to Docker and I'm trying to follow this simple "getting started" tutorial https://docs.docker.com/compose/gettingstarted/ using a fresh, first-time installation of Docker for Windows 10 downloaded from here: https://hub.docker.com/editions/community/docker-ce-desktop-windows/.
At step 4 of the tutorial I get this error:
PS D:\composetest> docker-compose up
Building web
Traceback (most recent call last):
File "site-packages\docker\credentials\store.py", line 80, in _execute
File "subprocess.py", line 395, in check_output
File "subprocess.py", line 487, in run
subprocess.CalledProcessError: Command '['C:\\Program Files\\Docker\\Docker\\resources\\bin\\docker-credential-desktop.EXE', 'list']' returned non-zero exit status 1.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "docker-compose", line 6, in <module>
File "compose\cli\main.py", line 72, in main
File "compose\cli\main.py", line 128, in perform_command
File "compose\cli\main.py", line 1078, in up
File "compose\cli\main.py", line 1074, in up
File "compose\project.py", line 548, in up
File "compose\service.py", line 367, in ensure_image_exists
File "compose\service.py", line 1106, in build
File "site-packages\docker\api\build.py", line 261, in build
File "site-packages\docker\api\build.py", line 308, in _set_auth_headers
File "site-packages\docker\auth.py", line 302, in get_all_credentials
File "site-packages\docker\credentials\store.py", line 71, in list
File "site-packages\docker\credentials\store.py", line 93, in _execute
docker.credentials.errors.StoreError: Credentials store docker-credential-desktop exited with "error listing credentials - err: exit status 1, out: `Impossibile trovare elemento.`".
[13284] Failed to execute script docker-compose
EDIT: The accepted solution in "docker-compose unable to start", unfortunately, did not work. (The Italian message "Impossibile trovare elemento." means "Unable to find the element.")
What is going wrong?

I have the same issue...
Try:
$ nano ~/.docker/config.json
In this file, change credsStore to credStore.
Now run your docker-compose again. If it still doesn't work, try it with sudo.
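For reference, here is a minimal sketch of that edit; the "desktop" value is just an example and may differ on your machine:
$ nano ~/.docker/config.json
# before (example value):                    "credsStore": "desktop"
# after renaming the key (or remove it):     "credStore": "desktop"
Save the file and retry docker-compose up.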

Try adding this directory to your PATH environment variable:
C:\Program Files\Docker\Docker\resources\bin
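For example, in PowerShell the per-user PATH can be extended like this (a sketch; restart the terminal afterwards, and adjust the directory if Docker is installed elsewhere):
[Environment]::SetEnvironmentVariable("Path", [Environment]::GetEnvironmentVariable("Path", "User") + ";C:\Program Files\Docker\Docker\resources\bin", "User")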
If you are using WSL2, run this command:
sudo ln -s /mnt/c/Program\ Files/Docker/Docker/resources/bin/docker-credential-desktop.exe /usr/bin/docker-credential-desktop.exe

Related

airflow worker crashing on helm upgrade with "Temporary failure in name resolution" for postgres

I have been trying to use a custom Dockerfile for adding my DAGs and plugins, as follows:
FROM apache/airflow:2.3.0-python3.7
COPY ./dags/ /opt/airflow/dags/
COPY ./plugins/ /opt/airflow/plugins/
COPY requirements.txt .
RUN pip install -r requirements.txt
EXPOSE 5555
which I am building as:
docker build -f base.dockerfile --pull --tag lqc-airflow:0.0.1 .
minikube image load lqc-airflow:0.0.1
and then doing a Helm upgrade:
helm upgrade $RELEASE_NAME apache-airflow/airflow --namespace $NAMESPACE --set images.airflow.repository=lqc-airflow --set images.airflow.tag=0.0.1
However, this makes just the airflow-worker-0 pod fail with the following error:
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/airflow/.local/bin/airflow", line 8, in <module>
sys.exit(main())
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/__main__.py", line 38, in main
args.func(args)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/cli/cli_parser.py", line 51, in command
return func(*args, **kwargs)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/cli.py", line 99, in wrapper
return f(*args, **kwargs)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/cli/commands/celery_command.py", line 130, in worker
session = celery_app.backend.ResultSession()
File "/home/airflow/.local/lib/python3.7/site-packages/celery/backends/database/__init__.py", line 109, in ResultSession
**self.engine_options)
File "/home/airflow/.local/lib/python3.7/site-packages/celery/backends/database/session.py", line 88, in session_factory
self.prepare_models(engine)
File "/home/airflow/.local/lib/python3.7/site-packages/celery/backends/database/session.py", line 72, in prepare_models
ResultModelBase.metadata.create_all(engine)
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/sql/schema.py", line 4745, in create_all
ddl.SchemaGenerator, self, checkfirst=checkfirst, tables=tables
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 3007, in _run_ddl_visitor
with self.begin() as conn:
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 2923, in begin
conn = self.connect(close_with_result=close_with_result)
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 3095, in connect
return self._connection_cls(self, close_with_result=close_with_result)
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 91, in __init__
else engine.raw_connection()
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 3174, in raw_connection
return self._wrap_pool_connect(self.pool.connect, _connection)
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 3145, in _wrap_pool_connect
e, dialect, self
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 2004, in _handle_dbapi_exception_noconnection
sqlalchemy_exception, with_traceback=exc_info[2], from_=e
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 211, in raise_
raise exception
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 3141, in _wrap_pool_connect
return fn()
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 301, in connect
return _ConnectionFairy._checkout(self)
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 755, in _checkout
fairy = _ConnectionRecord.checkout(pool)
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 419, in checkout
rec = pool._do_get()
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/pool/impl.py", line 259, in _do_get
return self._create_connection()
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 247, in _create_connection
return _ConnectionRecord(self)
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 362, in __init__
self.__connect(first_connect_check=True)
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 605, in __connect
pool.logger.debug("Error on connect(): %s", e)
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/util/langhelpers.py", line 72, in __exit__
with_traceback=exc_tb,
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 211, in raise_
raise exception
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 599, in __connect
connection = pool._invoke_creator(self)
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/create.py", line 578, in connect
return dialect.connect(*cargs, **cparams)
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 583, in connect
return self.dbapi.connect(*cargs, **cparams)
File "/home/airflow/.local/lib/python3.7/site-packages/psycopg2/__init__.py", line 122, in connect
conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) could not translate host name "postgres" to address: Temporary failure in name resolution
I am just following the official Airflow guidance on managing DAG files: https://airflow.apache.org/docs/helm-chart/stable/manage-dags-files.html
Please note that there are no such name-resolution errors if I don't use my custom Dockerfile. Kindly help!
Thanks @Hussein for your support, but I was able to solve it myself. I saw that the logs of the airflow-migrations pod complained about an alembic revision id not being found. That led me to https://airflow.apache.org/docs/apache-airflow/stable/migrations-ref.html, where I could see which Airflow tag carries that alembic revision.
My revision id was ecb43d2a1842, so the changes to my Dockerfile were:
FROM apache/airflow:2.4.3
COPY ./dags/ /opt/airflow/dags/
COPY ./plugins/ /opt/airflow/plugins/
COPY requirements.txt .
RUN pip install -r requirements.txt
So matching the image to Airflow 2.4.3 was the catch.
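For completeness, the rebuild and redeploy then mirror the commands from the question; the 0.0.2 tag below is just a hypothetical new image tag:
docker build -f base.dockerfile --pull --tag lqc-airflow:0.0.2 .
minikube image load lqc-airflow:0.0.2
helm upgrade $RELEASE_NAME apache-airflow/airflow --namespace $NAMESPACE --set images.airflow.repository=lqc-airflow --set images.airflow.tag=0.0.2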

Using Google Container Registry with Docker while offline

I'm using Google Container Registry (GCR), which I've associated with Docker using gcloud auth configure-docker.
https://cloud.google.com/sdk/gcloud/reference/auth/configure-docker
However, when my computer is offline and I run docker-compose up, I get an error because it tries to communicate/authenticate with Google.
How can I use Docker offline now that I've started using GCR?
$ docker-compose up --build --force-recreate -d
Building solr
ERROR: (gcloud.auth.docker-helper) There was a problem refreshing your current auth tokens: Unable to find the server at www.googleapis.com
Please run:
$ gcloud auth login
to obtain new credentials, or if you have already logged in with a
different account:
$ gcloud config set account ACCOUNT
to select an already authenticated account to use.
Traceback (most recent call last):
File "site-packages/dockerpycreds/store.py", line 74, in _execute
File "subprocess.py", line 336, in check_output
File "subprocess.py", line 418, in run
subprocess.CalledProcessError: Command '['/Users/me/google-cloud-sdk/bin/docker-credential-gcloud', 'get']' returned non-zero exit status 1.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "site-packages/docker/auth.py", line 129, in _resolve_authconfig_credstore
File "site-packages/dockerpycreds/store.py", line 35, in get
File "site-packages/dockerpycreds/store.py", line 87, in _execute
dockerpycreds.errors.StoreError: Credentials store docker-credential-gcloud exited with "".
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "docker-compose", line 6, in <module>
File "compose/cli/main.py", line 71, in main
File "compose/cli/main.py", line 127, in perform_command
File "compose/cli/main.py", line 1080, in up
File "compose/cli/main.py", line 1076, in up
File "compose/project.py", line 475, in up
File "compose/service.py", line 342, in ensure_image_exists
File "compose/service.py", line 1082, in build
File "site-packages/docker/api/build.py", line 251, in build
File "site-packages/docker/api/build.py", line 307, in _set_auth_headers
File "site-packages/docker/auth.py", line 96, in resolve_authconfig
File "site-packages/docker/auth.py", line 146, in _resolve_authconfig_credstore
docker.errors.DockerException: Credentials store error: StoreError('Credentials store docker-credential-gcloud exited with "".',)
[26419] Failed to execute script docker-compose
docker-compose reads your YAML file to configure your application's services, so if the YAML uses Docker images (official or personalized) that are not available locally, Docker will try to download them from the registry configured in your setup, in this case Google Container Registry, and if you are offline you will get an error.
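One possible workaround, as a sketch rather than an official recommendation: cache everything while you are still online, and keep the gcloud credential helper from being invoked while offline, for example by temporarily removing the gcr.io entries that gcloud auth configure-docker added under "credHelpers" in ~/.docker/config.json (back the file up first):
# while online: cache the images the compose file needs
docker-compose pull
docker-compose build
# while offline, with the gcr.io credential-helper entries temporarily removed,
# compose can use the locally cached images
docker-compose up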

RuntimeError: Can not reuse socket after connection was closed using docker

I am following this tutorial to run an Ethereum crawler using Docker on Windows 10.
After executing
MYSQL_DATA_PATH="$HOME/indexer-data/mysql" GETH_DATA_PATH="$HOME/indexer-data/geth" docker-compose up
I get this error:
409 Client Error: Conflict for url: http+docker://localnpipe/v1.25/containers/ee2b46142bae704d4963853e22a77ba896a8f841120ecf5ac97befca91847672/attach?logs=0&stdout=1&stderr=1&stream=1
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "threading.py", line 916, in _bootstrap_inner
File "threading.py", line 864, in run
File "compose\cli\log_printer.py", line 233, in watch_events
File "compose\container.py", line 215, in attach_log_stream
File "compose\container.py", line 307, in attach
File "site-packages\docker\utils\decorators.py", line 19, in wrapped
File "site-packages\docker\api\container.py", line 61, in attach
File "site-packages\docker\api\client.py", line 400, in _read_from_socket
File "site-packages\docker\api\client.py", line 311, in _get_raw_response_socket
File "site-packages\docker\api\client.py", line 263, in _raise_for_status
File "site-packages\docker\errors.py", line 19, in create_api_error_from_http_exception
File "site-packages\requests\models.py", line 880, in json
File "site-packages\requests\models.py", line 828, in content
File "site-packages\requests\models.py", line 750, in generate
File "site-packages\urllib3\response.py", line 496, in stream
File "site-packages\urllib3\response.py", line 444, in read
File "http\client.py", line 449, in read
File "http\client.py", line 493, in readinto
File "site-packages\docker\transport\npipesocket.py", line 209, in readinto
File "site-packages\docker\transport\npipesocket.py", line 20, in wrapped
RuntimeError: Can not reuse socket after connection was closed.
git config --global user.name "usrname"
git config --global user.password "paswd"
git clone https://github.com/usrame/eth-indexer.git
cd eth-indexer
touch .env
echo "MYSQL_DATA_PATH=~/indexer-data/mysql" > .env
echo "GETH_DATA_PATH=~/indexer-data/geth" > .env
git add -f .env
docker-compose build
mkdir -p ~/indexer-data/mysql ~/indexer-data/geth
# Create database schema
MYSQL_DATA_PATH="$HOME/indexer-data/mysql" docker-compose up idx-database idx-migration
MYSQL_DATA_PATH="$HOME/indexer-data/mysql" GETH_DATA_PATH="$HOME/indexer-data/geth" docker-compose up
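Note that docker-compose automatically reads a .env file in the project directory for variable substitution, so once both variables are appended to .env as above, the long inline prefix may no longer be necessary:
docker-compose up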

Dart VM fails to build on Windows

I am trying to build the Dart VM on Windows, following the steps described here:
https://github.com/dart-lang/sdk/wiki/Building
When I run the build.py command as below:
.\tools\build.py --mode release --arch x64 create_sdk
I get the following error:
gn gen --check in out\ReleaseX64
Traceback (most recent call last):
File "D:\ops\dart\sdk\tools\gn.py", line 436, in <module>
sys.exit(main(sys.argv))
File "D:\ops\dart\sdk\tools\gn.py", line 423, in main
results = pool.map(run_command, commands, chunksize=1)
File "C:\app\Python27\lib\multiprocessing\pool.py", line 251, in map
return self.map_async(func, iterable, chunksize).get()
File "C:\app\Python27\lib\multiprocessing\pool.py", line 567, in get
raise self._value
WindowsError: [Error 2] System cannot find file.
Tried to run GN, but it failed. Try running it manually:
$ python D:\ops\dart\sdk\tools\gn.py -m release -a x64 --os host -v
Traceback (most recent call last):
File "D:\ops\dart\sdk\tools\build.py", line 658, in <module>
sys.exit(Main())
File "D:\ops\dart\sdk\tools\build.py", line 651, in Main
mode, arch, cross_build) != 0:
File "D:\ops\dart\sdk\tools\build.py", line 491, in BuildOneConfig
args = BuildNinjaCommand(options, target, target_os, mode, arch)
File "D:\ops\dart\sdk\tools\build.py", line 473, in BuildNinjaCommand
if UseGoma(out_dir):
File "D:\ops\dart\sdk\tools\build.py", line 431, in UseGoma
return 'use_goma = true' in open(args_gn, 'r').read()
IOError: [Errno 2] No such file or directory: 'out\\ReleaseX64\\args.gn'
It seems the args.gn file is missing from the out\ReleaseX64 folder, but I cannot find args.gn anywhere in the Dart source folders. Is it generated during the build process? Did I do something wrong that prevented it from being generated?
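The error output itself suggests re-running GN manually. A sketch based on the paths in your log (assuming the same checkout and Python setup) would be to generate out\ReleaseX64\args.gn first and then retry the build:
python D:\ops\dart\sdk\tools\gn.py -m release -a x64 --os host -v
.\tools\build.py --mode release --arch x64 create_sdk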

WebRTC source download fails

When trying to download the latest WebRTC source for iOS on a Mac, it fails with the following error:
Running: gclient sync --with_branch_heads
Error: Command '/usr/bin/python src/webrtc/build/gyp_webrtc -Dextra_gyp_flag=0' returned non-zero exit status 1 in /Users/Latest
Traceback (most recent call last):
File "/Users/Latest/depot_tools/fetch.py", line 346, in <module>
sys.exit(main())
File "/Users/Latest/depot_tools/fetch.py", line 341, in main
return run(options, spec, root)
File "/Users/Latest/depot_tools/fetch.py", line 335, in run
return checkout.init()
File "/Users/Latest/depot_tools/fetch.py", line 142, in init
self.run_gclient(*sync_cmd)
File "/Users/Latest/depot_tools/fetch.py", line 76, in run_gclient
return self.run(cmd_prefix + cmd, **kwargs)
File "/Users/Latest/depot_tools/fetch.py", line 66, in run
return subprocess.check_output(cmd, **kwargs)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 573, in check_output
raise CalledProcessError(retcode, cmd, output=output)
subprocess.CalledProcessError: Command '('gclient', 'sync', '--with_branch_heads')' returned non-zero exit status 2
I'm following the same steps given here.
Any solution?
Did you install depot_tools?
(1) Prerequisite Software
(2) install-depot-tools
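If depot_tools is not installed yet, a minimal setup sketch (the clone location is arbitrary; add the directory to PATH permanently for regular use) looks like this:
git clone https://chromium.googlesource.com/chromium/tools/depot_tools.git
export PATH="$PWD/depot_tools:$PATH"
# then retry the fetch / gclient sync --with_branch_heads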
