Build multi-arch docker image - docker
I am trying to build an Ansible image for amd64 and arm64 using docker buildx, but the build always fails, as if the builder can't support any architecture other than the one of the hardware it is running on. I am on Debian and I have installed qemu-user-static and binfmt-support, so docker buildx ls gives the following result:
NAME/NODE DRIVER/ENDPOINT STATUS PLATFORMS
hopeful_wilson * docker-container
hopeful_wilson0 unix:///var/run/docker.sock running linux/amd64, linux/amd64/v2, linux/amd64/v3, linux/arm64, linux/riscv64, linux/ppc64le, linux/s390x, linux/386, linux/mips64le, linux/mips64, linux/arm/v7, linux/arm/v6
inspiring_lamarr docker-container
inspiring_lamarr0 unix:///var/run/docker.sock running linux/amd64, linux/amd64/v2, linux/amd64/v3, linux/arm64, linux/riscv64, linux/ppc64le, linux/s390x, linux/386, linux/mips64le, linux/mips64, linux/arm/v7, linux/arm/v6
default docker
default default running linux/amd64, linux/386, linux/arm64, linux/riscv64, linux/ppc64le, linux/s390x, linux/arm/v7, linux/arm/v6
and here is my Dockerfile:
FROM python:alpine3.15
ADD . /tmp
WORKDIR /tmp
RUN adduser -D -s /bin/sh -h /home/ansible ansible ansible
RUN apk update && apk add openssh-client bash
RUN rm -rf /var/cache/apk/*
RUN python3 -m pip install --upgrade pip
RUN python3 -m pip install -r requirements.txt
# RUN python3 -m pip install ansible
RUN sed -i "s#/bin/sh#/bin/bash#g" /etc/passwd
WORKDIR /
USER ansible
To build, I run:
docker buildx build --platform linux/amd64,linux/arm64 -t ansible-multi-arch .
And here is the result of my build:
WARNING: No output specified for docker-container driver. Build result will only remain in the build cache. To push result image into registry use --push or to load image into docker use --load
[+] Building 121.4s (24/25)
=> [internal] booting buildkit 5.8s
=> => pulling image moby/buildkit:buildx-stable-1 1.9s
=> => creating container buildx_buildkit_competent_bohr0 3.9s
=> [internal] load build definition from Dockerfile 0.8s
=> => transferring dockerfile: 419B 0.0s
=> [internal] load .dockerignore 0.9s
=> => transferring context: 2B 0.0s
=> [linux/amd64 internal] load metadata for docker.io/library/python:alpine3.15 3.0s
=> [linux/arm64 internal] load metadata for docker.io/library/python:alpine3.15 3.2s
=> [auth] library/python:pull token for registry-1.docker.io 0.0s
=> [linux/arm64 1/10] FROM docker.io/library/python:alpine3.15@sha256:74d722200c8cd876dcbd5cfb1d093c916e85d4318f051c2f3cfe5067c27cfbd5 12.9s
=> => resolve docker.io/library/python:alpine3.15@sha256:74d722200c8cd876dcbd5cfb1d093c916e85d4318f051c2f3cfe5067c27cfbd5 0.3s
=> => sha256:7025a0ca8e87cb11983c19a8a09007c64a9ebe5d2f9efe96b36658d2221721b7 231B / 231B 0.5s
=> => sha256:a1b0595ea6d26aa03a510d790d1fde94888af6ca5914982b6c9b6fd03c7a620c 3.04MB / 3.04MB 1.5s
=> => sha256:938e9f93fe23985a1f4d1d090e4ced82c31b2fabd4922b7761e42cfe7b56b9d7 12.67MB / 12.67MB 3.3s
=> => sha256:5e18021c0d0bf0a6f79a44bbbd33f12adeb2ef1358b140d8e7ef36e20c1e63b3 682.39kB / 682.39kB 0.7s
=> => sha256:47517142f6ba87eca6b7bdca1e0df160b74671c81e4b9605dad38c1862a43be3 2.72MB / 2.72MB 1.2s
=> => extracting sha256:47517142f6ba87eca6b7bdca1e0df160b74671c81e4b9605dad38c1862a43be3 0.3s
=> => extracting sha256:5e18021c0d0bf0a6f79a44bbbd33f12adeb2ef1358b140d8e7ef36e20c1e63b3 0.5s
=> => extracting sha256:938e9f93fe23985a1f4d1d090e4ced82c31b2fabd4922b7761e42cfe7b56b9d7 1.6s
=> => extracting sha256:7025a0ca8e87cb11983c19a8a09007c64a9ebe5d2f9efe96b36658d2221721b7 1.7s
=> => extracting sha256:a1b0595ea6d26aa03a510d790d1fde94888af6ca5914982b6c9b6fd03c7a620c 1.6s
=> [internal] load build context 1.4s
=> => transferring context: 49.12kB 0.0s
=> [linux/amd64 1/10] FROM docker.io/library/python:alpine3.15@sha256:74d722200c8cd876dcbd5cfb1d093c916e85d4318f051c2f3cfe5067c27cfbd5 9.0s
=> => resolve docker.io/library/python:alpine3.15@sha256:74d722200c8cd876dcbd5cfb1d093c916e85d4318f051c2f3cfe5067c27cfbd5 0.3s
=> => sha256:4211f440f0679002fb62db619c32ebb8894e25c77a19249fdf742fd4dbfb6555 3.04MB / 3.04MB 0.7s
=> => sha256:e8792c1c2edc87ab51a97c35dd511344734a625d1e67e3fd27dcfaa37ebe8eaf 231B / 231B 0.3s
=> => sha256:abed0206f3914209d0e7a549b92f3b0c85b421285ab998e63ea64d093f71289f 681.67kB / 681.67kB 1.3s
=> => sha256:9621f1afde84053b2f9b6ff34fc7f7460712247c01cbab483c5fa7132cf782ca 2.82MB / 2.82MB 1.1s
=> => sha256:0b0ae0fe5b972748ea6475feec4cd2238797fd89b8870a9f4a572f29488e5f88 12.58MB / 12.58MB 2.8s
=> => extracting sha256:9621f1afde84053b2f9b6ff34fc7f7460712247c01cbab483c5fa7132cf782ca 0.4s
=> => extracting sha256:abed0206f3914209d0e7a549b92f3b0c85b421285ab998e63ea64d093f71289f 0.7s
=> => extracting sha256:0b0ae0fe5b972748ea6475feec4cd2238797fd89b8870a9f4a572f29488e5f88 0.5s
=> => extracting sha256:e8792c1c2edc87ab51a97c35dd511344734a625d1e67e3fd27dcfaa37ebe8eaf 0.3s
=> => extracting sha256:4211f440f0679002fb62db619c32ebb8894e25c77a19249fdf742fd4dbfb6555 1.6s
=> [linux/amd64 2/10] ADD . /tmp 7.0s
=> [linux/arm64 2/10] ADD . /tmp 3.5s
=> [linux/arm64 3/10] WORKDIR /tmp 1.9s
=> [linux/amd64 3/10] WORKDIR /tmp 1.9s
=> [linux/arm64 4/10] RUN adduser -D -s /bin/sh -h /home/ansible ansible ansible 1.6s
=> [linux/amd64 4/10] RUN adduser -D -s /bin/sh -h /home/ansible ansible ansible 1.6s
=> [linux/arm64 5/10] RUN apk update && apk add openssh-client bash 4.3s
=> [linux/amd64 5/10] RUN apk update && apk add openssh-client bash 2.6s
=> [linux/amd64 6/10] RUN rm -rf /var/cache/apk/* 0.5s
=> [linux/amd64 7/10] RUN python3 -m pip install --upgrade pip 5.9s
=> [linux/arm64 6/10] RUN rm -rf /var/cache/apk/* 1.2s
=> [linux/arm64 7/10] RUN python3 -m pip install --upgrade pip 23.5s
=> [linux/amd64 8/10] RUN python3 -m pip install -r requirements.txt 43.1s
=> ERROR [linux/arm64 8/10] RUN python3 -m pip install -r requirements.txt 61.9s
=> [linux/amd64 9/10] RUN sed -i "s#/bin/sh#/bin/bash#g" /etc/passwd 1.4s
------
> [linux/arm64 8/10] RUN python3 -m pip install -r requirements.txt:
#0 3.679 Collecting ansible==6.3.0
#0 3.820 Downloading ansible-6.3.0-py3-none-any.whl (41.0 MB)
#0 8.925 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 41.0/41.0 MB 5.1 MB/s eta 0:00:00
#0 10.81 Collecting ansible-core==2.13.3
#0 10.84 Downloading ansible_core-2.13.3-py3-none-any.whl (2.1 MB)
#0 11.12 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.1/2.1 MB 7.8 MB/s eta 0:00:00
#0 12.29 Collecting cffi==1.15.1
#0 12.32 Downloading cffi-1.15.1.tar.gz (508 kB)
#0 12.77 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 508.5/508.5 kB 1.2 MB/s eta 0:00:00
#0 13.15 Preparing metadata (setup.py): started
#0 17.85 Preparing metadata (setup.py): finished with status 'done'
#0 18.91 Collecting cryptography==37.0.4
#0 18.95 Downloading cryptography-37.0.4-cp36-abi3-musllinux_1_1_aarch64.whl (4.1 MB)
#0 19.47 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.1/4.1 MB 8.2 MB/s eta 0:00:00
#0 20.18 Collecting Jinja2==3.1.2
#0 20.22 Downloading Jinja2-3.1.2-py3-none-any.whl (133 kB)
#0 20.68 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 133.1/133.1 kB 291.5 kB/s eta 0:00:00
#0 21.19 Collecting MarkupSafe==2.1.1
#0 21.22 Downloading MarkupSafe-2.1.1-cp310-cp310-musllinux_1_1_aarch64.whl (30 kB)
#0 22.11 Collecting packaging==21.3
#0 22.15 Downloading packaging-21.3-py3-none-any.whl (40 kB)
#0 22.48 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 40.8/40.8 kB 96.3 kB/s eta 0:00:00
#0 23.14 Collecting pycparser==2.21
#0 23.17 Downloading pycparser-2.21-py2.py3-none-any.whl (118 kB)
#0 23.25 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 118.7/118.7 kB 2.6 MB/s eta 0:00:00
#0 23.55 Collecting pyparsing==3.0.9
#0 23.58 Downloading pyparsing-3.0.9-py3-none-any.whl (98 kB)
#0 23.66 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 98.3/98.3 kB 2.1 MB/s eta 0:00:00
#0 23.94 Collecting PyYAML==6.0
#0 23.97 Downloading PyYAML-6.0.tar.gz (124 kB)
#0 24.25 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 125.0/125.0 kB 479.8 kB/s eta 0:00:00
#0 25.03 Installing build dependencies: started
#0 39.09 Installing build dependencies: finished with status 'done'
#0 39.09 Getting requirements to build wheel: started
#0 46.71 Getting requirements to build wheel: finished with status 'done'
#0 46.73 Preparing metadata (pyproject.toml): started
#0 48.48 Preparing metadata (pyproject.toml): finished with status 'done'
#0 48.66 Collecting resolvelib==0.8.1
#0 48.70 Downloading resolvelib-0.8.1-py2.py3-none-any.whl (16 kB)
#0 49.30 Building wheels for collected packages: cffi, PyYAML
#0 49.30 Building wheel for cffi (setup.py): started
#0 50.82 Building wheel for cffi (setup.py): finished with status 'error'
#0 50.86 error: subprocess-exited-with-error
#0 50.86
#0 50.86 × python setup.py bdist_wheel did not run successfully.
#0 50.86 │ exit code: 1
#0 50.86 ╰─> [47 lines of output]
#0 50.86
#0 50.86 No working compiler found, or bogus compiler options passed to
#0 50.86 the compiler from Python's standard "distutils" module. See
#0 50.86 the error messages above. Likely, the problem is not related
#0 50.86 to CFFI but generic to the setup.py of any Python package that
#0 50.86 tries to compile C code. (Hints: on OS/X 10.8, for errors about
#0 50.86 -mno-fused-madd see http://stackoverflow.com/questions/22313407/
#0 50.86 Otherwise, see https://wiki.python.org/moin/CompLangPython or
#0 50.86 the IRC channel #python on irc.libera.chat.)
#0 50.86
#0 50.86 Trying to continue anyway. If you are trying to install CFFI from
#0 50.86 a build done in a different context, you can ignore this warning.
#0 50.86
#0 50.86 /usr/local/lib/python3.10/site-packages/setuptools/config/setupcfg.py:463: SetuptoolsDeprecationWarning: The license_file parameter is deprecated, use license_files instead.
#0 50.86 warnings.warn(msg, warning_class)
#0 50.86 running bdist_wheel
#0 50.86 running build
#0 50.86 running build_py
#0 50.86 creating build
#0 50.86 creating build/lib.linux-aarch64-cpython-310
#0 50.86 creating build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/setuptools_ext.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/error.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/vengine_cpy.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/pkgconfig.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/commontypes.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/backend_ctypes.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/lock.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/vengine_gen.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/api.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/verifier.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/model.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/cparser.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/cffi_opcode.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/__init__.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/recompiler.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/ffiplatform.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/_cffi_include.h -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/parse_c_type.h -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/_embedding.h -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 copying cffi/_cffi_errors.h -> build/lib.linux-aarch64-cpython-310/cffi
#0 50.86 running build_ext
#0 50.86 building '_cffi_backend' extension
#0 50.86 creating build/temp.linux-aarch64-cpython-310
#0 50.86 creating build/temp.linux-aarch64-cpython-310/c
#0 50.86 gcc -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -DTHREAD_STACK_SIZE=0x100000 -fPIC -DFFI_BUILDING=1 -I/usr/include/ffi -I/usr/include/libffi -I/usr/local/include/python3.10 -c c/_cffi_backend.c -o build/temp.linux-aarch64-cpython-310/c/_cffi_backend.o
#0 50.86 error: command 'gcc' failed: No such file or directory
#0 50.86 [end of output]
#0 50.86
#0 50.86 note: This error originates from a subprocess, and is likely not a problem with pip.
#0 50.87 ERROR: Failed building wheel for cffi
#0 50.87 Running setup.py clean for cffi
#0 52.15 Building wheel for PyYAML (pyproject.toml): started
#0 54.17 Building wheel for PyYAML (pyproject.toml): finished with status 'done'
#0 54.17 Created wheel for PyYAML: filename=PyYAML-6.0-cp310-cp310-linux_aarch64.whl size=45335 sha256=6180755536e9685dbf4baacd3bfdc04e71f7c56e2467f12a2f994e33065dd5bb
#0 54.18 Stored in directory: /root/.cache/pip/wheels/1d/f3/b4/4aea0992adbed14b36ce9c3857d3707c762a4374479230685d
#0 54.19 Successfully built PyYAML
#0 54.19 Failed to build cffi
#0 55.52 Installing collected packages: resolvelib, PyYAML, pyparsing, pycparser, MarkupSafe, packaging, Jinja2, cffi, cryptography, ansible-core, ansible
#0 58.06 Running setup.py install for cffi: started
#0 59.47 Running setup.py install for cffi: finished with status 'error'
#0 59.50 error: subprocess-exited-with-error
#0 59.50
#0 59.50 × Running setup.py install for cffi did not run successfully.
#0 59.50 │ exit code: 1
#0 59.50 ╰─> [49 lines of output]
#0 59.50
#0 59.50 No working compiler found, or bogus compiler options passed to
#0 59.50 the compiler from Python's standard "distutils" module. See
#0 59.50 the error messages above. Likely, the problem is not related
#0 59.50 to CFFI but generic to the setup.py of any Python package that
#0 59.50 tries to compile C code. (Hints: on OS/X 10.8, for errors about
#0 59.50 -mno-fused-madd see http://stackoverflow.com/questions/22313407/
#0 59.50 Otherwise, see https://wiki.python.org/moin/CompLangPython or
#0 59.50 the IRC channel #python on irc.libera.chat.)
#0 59.50
#0 59.50 Trying to continue anyway. If you are trying to install CFFI from
#0 59.50 a build done in a different context, you can ignore this warning.
#0 59.50
#0 59.50 /usr/local/lib/python3.10/site-packages/setuptools/config/setupcfg.py:463: SetuptoolsDeprecationWarning: The license_file parameter is deprecated, use license_files instead.
#0 59.50 warnings.warn(msg, warning_class)
#0 59.50 running install
#0 59.50 /usr/local/lib/python3.10/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
#0 59.50 warnings.warn(
#0 59.50 running build
#0 59.50 running build_py
#0 59.50 creating build
#0 59.50 creating build/lib.linux-aarch64-cpython-310
#0 59.50 creating build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/setuptools_ext.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/error.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/vengine_cpy.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/pkgconfig.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/commontypes.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/backend_ctypes.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/lock.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/vengine_gen.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/api.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/verifier.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/model.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/cparser.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/cffi_opcode.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/__init__.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/recompiler.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/ffiplatform.py -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/_cffi_include.h -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/parse_c_type.h -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/_embedding.h -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 copying cffi/_cffi_errors.h -> build/lib.linux-aarch64-cpython-310/cffi
#0 59.50 running build_ext
#0 59.50 building '_cffi_backend' extension
#0 59.50 creating build/temp.linux-aarch64-cpython-310
#0 59.50 creating build/temp.linux-aarch64-cpython-310/c
#0 59.50 gcc -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -DTHREAD_STACK_SIZE=0x100000 -fPIC -DFFI_BUILDING=1 -I/usr/include/ffi -I/usr/include/libffi -I/usr/local/include/python3.10 -c c/_cffi_backend.c -o build/temp.linux-aarch64-cpython-310/c/_cffi_backend.o
#0 59.50 error: command 'gcc' failed: No such file or directory
#0 59.50 [end of output]
#0 59.50
#0 59.50 note: This error originates from a subprocess, and is likely not a problem with pip.
#0 59.51 error: legacy-install-failure
#0 59.51
#0 59.51 × Encountered error while trying to install package.
#0 59.51 ╰─> cffi
#0 59.51
#0 59.51 note: This is an issue with the package mentioned above, not pip.
#0 59.51 hint: See above for output from the failure.
------
Dockerfile:10
--------------------
8 | RUN rm -rf /var/cache/apk/*
9 | RUN python3 -m pip install --upgrade pip
10 | >>> RUN python3 -m pip install -r requirements.txt
11 | # RUN python3 -m pip install ansible
12 | RUN sed -i "s#/bin/sh#/bin/bash#g" /etc/passwd
--------------------
error: failed to solve: process "/bin/sh -c python3 -m pip install -r requirements.txt" did not complete successfully: exit code: 1
Any idea?
Thank you.
When docker buildx ls shows the platform list like this:
hopeful_wilson * docker-container
hopeful_wilson0 unix:///var/run/docker.sock running linux/amd64, linux/amd64/v2, linux/amd64/v3, linux/arm64, linux/riscv64, linux/ppc64le, linux/s390x, linux/386, linux/mips64le, linux/mips64, linux/arm/v7, linux/arm/v6
the QEMU and buildx configuration is done properly. The fact that the build gets through several linux/arm64 RUN steps before failing verifies it even further. The step that did fail threw a few errors, including:
#0 50.86 No working compiler found, or bogus compiler options passed to
#0 50.86 the compiler from Python's standard "distutils" module. See
#0 50.86 the error messages above. Likely, the problem is not related
#0 50.86 to CFFI but generic to the setup.py of any Python package that
#0 50.86 tries to compile C code. (Hints: on OS/X 10.8, for errors about
#0 50.86 -mno-fused-madd see http://stackoverflow.com/questions/22313407/
#0 50.86 Otherwise, see https://wiki.python.org/moin/CompLangPython or
#0 50.86 the IRC channel #python on irc.libera.chat.)
#0 50.86
#0 50.86 Trying to continue anyway. If you are trying to install CFFI from
#0 50.86 a build done in a different context, you can ignore this warning.
...
#0 50.86 gcc -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -DTHREAD_STACK_SIZE=0x100000 -fPIC -DFFI_BUILDING=1 -I/usr/include/ffi -I/usr/include/libffi -I/usr/local/include/python3.10 -c c/_cffi_backend.c -o build/temp.linux-aarch64-cpython-310/c/_cffi_backend.o
#0 50.86 error: command 'gcc' failed: No such file or directory
#0 50.86 [end of output]
which indicates the build needed gcc and failed to compile without it. You can attempt to change the package install line to include gcc, along with the headers that compiling cffi on Alpine typically requires (musl-dev for the C library headers, and libffi-dev for the -I/usr/include/ffi include path visible in the failing gcc command):
RUN apk add openssh-client bash gcc musl-dev libffi-dev
But that's no guarantee that the underlying application supports arm64. If you have the other platform available, it's worth attempting a native docker build directly on that platform before emulating the architecture with docker buildx from another machine. From the comments, it sounds like that has failed too, and you'll need to work with the application's maintainers to fix it.
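As a follow-up to that suggestion: the build log shows pip downloading cffi-1.15.1.tar.gz for arm64 (no matching wheel was found for that platform), while cryptography and MarkupSafe arrived as pre-built musllinux aarch64 wheels, so only cffi needs the compiler. If installing the toolchain gets the build through, a common Alpine pattern is to install it as a virtual package group and remove it again in the same layer, keeping the final image small. This is only a sketch, assuming gcc, musl-dev and libffi-dev turn out to be the only build dependencies that requirements.txt needs:

```dockerfile
FROM python:alpine3.15
ADD . /tmp
WORKDIR /tmp
RUN apk add --no-cache openssh-client bash
# Install the toolchain as a named ".build-deps" group, compile the Python
# dependencies, then delete the group in the same RUN so it never persists
# into a layer of the final image.
RUN apk add --no-cache --virtual .build-deps gcc musl-dev libffi-dev \
 && python3 -m pip install --upgrade pip \
 && python3 -m pip install -r requirements.txt \
 && apk del .build-deps
```

Because the apk add and apk del happen inside one RUN instruction, the compiler is available while pip builds cffi but is absent from the resulting image for both architectures.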
Related
How do I avoid a "x509: certificate signed by unknown authority" when doing a "go get download" from an alpine container?
I am trying to build coredns from scratch with the following Dockerfile: FROM golang:alpine SHELL [ "/bin/sh", "-ec" ] RUN apk update && apk add --no-cache git make ca-certificates openssl && update-ca-certificates RUN git clone https://github.com/coredns/coredns.git WORKDIR /go/coredns RUN go get download RUN make When I run docker build --no-cache --progress=plain -t coredns . this is the output and error I am getting: #1 [internal] load build definition from Dockerfile #1 sha256:5b65661f68f3298655d88d1e83c5014118e9d278e724f83e2f8d968a8f11fe27 #1 transferring dockerfile: 619B done #1 DONE 0.0s #2 [internal] load .dockerignore #2 sha256:2e78fdc563f1836b7815b48a445b2878de57404b5573a93080990b3c49e92f8f #2 transferring context: 2B done #2 DONE 0.0s #3 [internal] load metadata for docker.io/library/golang:alpine #3 sha256:299327d28eff710219f2e24597cfa9b226e8b1b0dc90f9e2122573004cfe837f #3 DONE 0.5s #4 [1/6] FROM docker.io/library/golang:alpine#sha256:2381c1e5f8350a901597d633b2e517775eeac7a6682be39225a93b22cfd0f8bb #4 sha256:bcd1e622e133c928bad4175797b9e323eb9ac29a1d90fbb12f2566da7e868b8f #4 CACHED #5 [2/6] RUN apk update && apk add --no-cache git make ca-certificates openssl && update-ca-certificates #5 sha256:6dd058a5b7f80d591599c7ab466c65cf38e8d5d1b7ddb8f4d2e5d1c0e79a32f0 #5 0.198 fetch https://dl-cdn.alpinelinux.org/alpine/v3.17/main/x86_64/APKINDEX.tar.gz #5 0.847 fetch https://dl-cdn.alpinelinux.org/alpine/v3.17/community/x86_64/APKINDEX.tar.gz #5 1.224 v3.17.1-21-gf40c2ce77f [https://dl-cdn.alpinelinux.org/alpine/v3.17/main] #5 1.224 v3.17.1-23-g06668be47f [https://dl-cdn.alpinelinux.org/alpine/v3.17/community] #5 1.224 OK: 17813 distinct packages available #5 1.280 fetch https://dl-cdn.alpinelinux.org/alpine/v3.17/main/x86_64/APKINDEX.tar.gz #5 1.753 fetch https://dl-cdn.alpinelinux.org/alpine/v3.17/community/x86_64/APKINDEX.tar.gz #5 2.043 (1/8) Installing brotli-libs (1.0.9-r9) #5 2.120 (2/8) Installing nghttp2-libs (1.51.0-r0) #5 2.182 (3/8) Installing 
libcurl (7.87.0-r1) #5 2.257 (4/8) Installing libexpat (2.5.0-r0) #5 2.314 (5/8) Installing pcre2 (10.42-r0) #5 2.387 (6/8) Installing git (2.38.2-r0) #5 2.622 (7/8) Installing make (4.3-r1) #5 2.686 (8/8) Installing openssl (3.0.7-r2) #5 2.763 Executing busybox-1.35.0-r29.trigger #5 2.774 OK: 17 MiB in 24 packages #5 DONE 2.9s #6 [3/6] RUN git clone https://github.com/coredns/coredns.git #6 sha256:aae1eab60ab1f0ffb8d8a48bd03ef02b93bb537b82f1bd4285cfcb2731e19ff4 #6 0.264 Cloning into 'coredns'... #6 DONE 14.1s #7 [4/6] WORKDIR /go/coredns #7 sha256:2291c568fa24f46c6531c6e7d41d5e1150d10485b34e88a85f81542e26295acb #7 DONE 0.0s #8 [5/6] RUN go get download #8 sha256:b2878fe66127be7ffe2e7f4e1f6b538679aebda0abffdd20b14bf928ef23957f #8 3.603 go: cloud.google.com/go/compute#v1.14.0: Get "https://proxy.golang.org/cloud.google.com/go/compute/#v/v1.14.0.mod": x509: certificate signed by unknown authority #8 ERROR: executor failed running [/bin/sh -ec go get download]: exit code: 1 ------ > [5/6] RUN go get download: ------ executor failed running [/bin/sh -ec go get download]: exit code: 1 I've googled my heart out trying to figure out how to get past the "x509: certificate signed by unknown authority" error. Any help is appreciated.
It looks like the issue was caused by the Cisco AnyConnect client on my Mac. You can uninstall Cisco AnyConect or add the following to your Dockerfile. RUN wget http://www.cisco.com/security/pki/certs/ciscoumbrellaroot.cer RUN openssl x509 -inform DER -in ciscoumbrellaroot.cer -out ciscoumbrellaroot.crt RUN cp ciscoumbrellaroot.crt /usr/local/share/ca-certificates/ciscoumbrellaroot.crt RUN update-ca-certificates I found the answer here.
Some unresolved dependencies have extra attributes
I used dockerfile below to build play-samples-play-scala-hello-world-tutorial(https://github.com/playframework/play-samples/tree/2.8.x/play-scala-hello-world-tutorial) I want to build the tutorial by dockerfile, but got an error like downloading issue. I wonder whether this is issue with network or dockerfile. FROM openjdk:11 ENV TZ=America/Los_Angeles RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone ENV HULU_ENV=staging ADD . /play-samples-play-scala-hello-world-tutorial WORKDIR /play-samples-play-scala-hello-world-tutorial RUN curl -L https://github.com/sbt/sbt/releases/download/v1.5.2/sbt-1.5.2.tgz -o sbt.tgz RUN tar xf sbt.tgz RUN ./sbt/bin/sbt clean stage but got error below and failed to build docker build . -f Dockerfile #11 0.595 copying runtime jar... #11 69.16 [warn] Note: Some unresolved dependencies have extra attributes. Check that these dependencies exist with the requested attributes. #11 69.16 [warn] com.typesafe.sbt:sbt-js-engine:1.2.3 (scalaVersion=2.12, sbtVersion=1.0) #11 69.16 [warn] org.foundweekends.giter8:sbt-giter8-scaffold:0.11.0 (sbtVersion=1.0, scalaVersion=2.12) #11 69.16 [warn] com.typesafe.sbt:sbt-native-packager:1.5.2 (scalaVersion=2.12, sbtVersion=1.0) #11 69.16 [warn] com.lightbend.sbt:sbt-javaagent:0.1.5 (scalaVersion=2.12, sbtVersion=1.0) #11 69.16 [warn] com.typesafe.sbt:sbt-twirl:1.5.1 (scalaVersion=2.12, sbtVersion=1.0) #11 69.16 [warn] com.typesafe.sbt:sbt-web:1.4.4 (scalaVersion=2.12, sbtVersion=1.0) #11 69.16 [warn] #11 69.16 [warn] Note: Unresolved dependencies path: #11 69.25 [error] Error downloading org.foundweekends.giter8:sbt-giter8-scaffold;sbtVersion=1.0;scalaVersion=2.12:0.11.0 #11 69.25 [error] Not found #11 69.25 [error] Not found #11 69.25 [error] not found: https://repo1.maven.org/maven2/org/foundweekends/giter8/sbt-giter8-scaffold_2.12_1.0/0.11.0/sbt-giter8-scaffold-0.11.0.pom #11 69.25 [error] not found: 
/root/.ivy2/localorg.foundweekends.giter8/sbt-giter8-scaffold/scala_2.12/sbt_1.0/0.11.0/ivys/ivy.xml #11 69.25 [error] download error: Caught javax.net.ssl.SSLHandshakeException (Remote host terminated the handshake) while downloading https://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/org.foundweekends.giter8/sbt-giter8-scaffold/scala_2.12/sbt_1.0/0.11.0/ivys/ivy.xml #11 69.25 [error] download error: Caught javax.net.ssl.SSLHandshakeException (Remote host terminated the handshake) while downloading https://repo.typesafe.com/typesafe/ivy-releases/org.foundweekends.giter8/sbt-giter8-scaffold/scala_2.12/sbt_1.0/0.11.0/ivys/ivy.xml any help is appreciated!
The relevant error is: download error: Caught javax.net.ssl.SSLHandshakeException (Remote host terminated the handshake) while downloading https://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/org.foundweekends.giter8/sbt-giter8-scaffold/scala_2.12/sbt_1.0/0.11.0/ivys/ivy.xml SBT is not able to connect to the repository to download the dependencies because of some HTTPS issue. This can be caused by several things, either because your container doesn't have proper certificates or because your running behind a corporate proxy maybe and it mess with certificates. You should be able to find more help by searching on the SSLHandshakeException error.
"Docker Image" build fails due to failure in "JupyterLab" installation
I am trying to build a docker image for using a package named as Automated Recommendation Tool. As per their docker workflow I installed docker on my Ubuntu OS and then tried to build the docker image. Following is the command that I executed - DOCKER_BUILDKIT=1 \ docker build -f docker/Dockerfile \ --pull \ --no-cache \ -t jbei/art . After running for a while I got the following error - => ERROR [jupyter-install 1/1] RUN set -ex && poetry install --n 130.5s It automatically continued running and later on stopped by giving following output - #13 129.9 - `minimize`: This option controls whether your JS bundle is minified #13 129.9 during the Webpack build, which helps to improve JupyterLab's overall #13 129.9 performance. However, the minifier plugin used by Webpack is very memory #13 129.9 intensive, so turning it off may help the build finish successfully in #13 129.9 low-memory environments. #13 129.9 #13 130.0 An error occurred. #13 130.0 RuntimeError: JupyterLab failed to build #13 130.0 See the log file for details: /tmp/jupyterlab-debug-c399mxqe.log ------ executor failed running [/bin/sh -c set -ex && poetry install --no-dev --extras "docker jupyter" --no-root --no-interaction -vv && rm -rf $ART_USER/.cache/ && jupyter lab build && jupyter labextension install #jupyter-widgets/jupyterlab-manager && find ${ART_CODE} -name __pycache__ | xargs rm -rf]: exit code: 1 Anyone experienced in building Docker Images, Please help me. 
Following are my system specifications - Operating System: Ubuntu 22.04 LTS Kernel: Linux 5.15.0-37-generic Architecture: x86-64 Hardware Vendor: HP Hardware Model: HP Pavilion Gaming Laptop 15-ec1xxx Docker Version - Docker version 20.10.17, build 100c701 Update: Following is the complete output of the jupyter lab build command - #10 39.59 + rm -rf artuser/.cache/ #10 39.59 + jupyter lab build #10 40.91 [LabBuildApp] JupyterLab 3.4.2 #10 40.91 [LabBuildApp] Building in /usr/local/art/.venv/share/jupyter/lab #10 41.26 [LabBuildApp] Building jupyterlab assets (production, minimized) #10 41.28-Build failed. #10 122.8 Troubleshooting: If the build failed due to an out-of-memory error, you #10 122.8 may be able to fix it by disabling the `dev_build` and/or `minimize` options. #10 122.8 #10 122.8 If you are building via the `jupyter lab build` command, you can disable #10 122.8 these options like so: #10 122.8 #10 122.8 jupyter lab build --dev-build=False --minimize=False #10 122.8 #10 122.8 You can also disable these options for all JupyterLab builds by adding these #10 122.8 lines to a Jupyter config file named `jupyter_config.py`: #10 122.8 #10 122.8 c.LabBuildApp.minimize = False #10 122.8 c.LabBuildApp.dev_build = False #10 122.8 #10 122.8 If you don't already have a `jupyter_config.py` file, you can create one by #10 122.8 adding a blank file of that name to any of the Jupyter config directories. #10 122.8 The config directories can be listed by running: #10 122.8 #10 122.8 jupyter --paths #10 122.8 #10 122.8 Explanation: #10 122.8 #10 122.8 - `dev-build`: This option controls whether a `dev` or a more streamlined #10 122.8 `production` build is used. This option will default to `False` (i.e., the #10 122.8 `production` build) for most users. However, if you have any labextensions #10 122.8 installed from local files, this option will instead default to `True`. 
#10 122.8 Explicitly setting `dev-build` to `False` will ensure that the `production` #10 122.8 build is used in all circumstances. #10 122.8 #10 122.8 - `minimize`: This option controls whether your JS bundle is minified #10 122.8 during the Webpack build, which helps to improve JupyterLab's overall #10 122.8 performance. However, the minifier plugin used by Webpack is very memory #10 122.8 intensive, so turning it off may help the build finish successfully in #10 122.8 low-memory environments. #10 122.8 #10 122.8 An error occurred. #10 122.8 RuntimeError: JupyterLab failed to build #10 122.8 See the log file for details: /tmp/jupyterlab-debug-iguri15x.log ------ executor failed running [/bin/sh -c set -ex && poetry install --no-dev --extras "docker jupyter" --no-root --no-interaction -vv && rm -rf $ART_USER/.cache/ && jupyter lab build && jupyter labextension install #jupyter-widgets/jupyterlab-manager && find ${ART_CODE} -name __pycache__ | xargs rm -rf]: exit code: 1 From these lines I can see that it is suggesting to do the following changes in my jupyter lab build command in the Dockerfile - jupyter lab build --dev-build=False --minimize=False Will this help, or should I first check the log files which I am unable to locate as the image is not built.
docker build : getting-started tutorial => Certificate error
I'm having troubles with the getting started tutorial of docs.docker.com : https://docs.docker.com/get-started/02_our_app/ When i execute the following command : docker build -t getting-started . I get the following errors : > [2/5] RUN apk add --no-cache python2 g++ make: #5 0.412 fetch https://dl-cdn.alpinelinux.org/alpine/v3.14/main/x86_64/APKINDEX.tar.gz #5 0.551 139899692677960:error:1416F086:SSL routines:tls_process_server_certificate:certificate verify failed:ssl/statem/statem_clnt.c:1914: #5 0.552 WARNING: Ignoring https://dl-cdn.alpinelinux.org/alpine/v3.14/main: Permission denied #5 0.552 fetch https://dl-cdn.alpinelinux.org/alpine/v3.14/community/x86_64/APKINDEX.tar.gz #5 0.603 139899692677960:error:1416F086:SSL routines:tls_process_server_certificate:certificate verify failed:ssl/statem/statem_clnt.c:1914: #5 0.604 WARNING: Ignoring https://dl-cdn.alpinelinux.org/alpine/v3.14/community: Permission denied #5 0.604 ERROR: unable to select packages: #5 0.605 g++ (no such package): #5 0.605 required by: world[g++] #5 0.605 make (no such package): #5 0.605 required by: world[make] #5 0.605 python2 (no such package): #5 0.605 required by: world[python2] ------ executor failed running [/bin/sh -c apk add --no-cache python2 g++ make]: exit code: 3 I'm on Windows 10 V1909 and i downloaded WSL 2 like specified in the tutorial. EDIT : Like Hans Kilian answered, it was a VPN problem...
As Hans Kilian answered, it was a VPN problem. Be careful if you are behind a proxy.
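For the proxy case: if a VPN or proxy re-signs TLS traffic, apk inside the build container cannot verify the Alpine CDN's certificate. One common workaround is to add the proxy's root CA to the image's trust bundle before apk runs. A sketch, where corporate-ca.crt is a hypothetical name for the proxy's root certificate exported from the host, and node:12-alpine is assumed as the tutorial's base image:

```dockerfile
FROM node:12-alpine
# Hypothetical file: the root CA certificate your VPN/proxy uses to re-sign TLS
COPY corporate-ca.crt /tmp/corporate-ca.crt
# Alpine ships a CA bundle by default; appending the proxy CA lets apk
# verify the intercepted connection to dl-cdn.alpinelinux.org
RUN cat /tmp/corporate-ca.crt >> /etc/ssl/certs/ca-certificates.crt \
    && apk add --no-cache python2 g++ make
```

Disconnecting the VPN for the duration of the build, as the asker did, avoids the problem entirely.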
Docker build of a multi-platform ASP.NET Core 5.0 app on arm/v7 with GitHub Actions
I have a problem and I can't find a solution... I am trying to build a Docker image in a pipeline with Docker's GitHub Actions. The image builds an ASP.NET Core 5.0 app. My scripts (Dockerfile and GitHub workflow file) work fine, and the image builds successfully. However, when I try to build the image for multiple architectures including linux/arm/v7, I get an error. NOTE: on other architectures like linux/amd64 or linux/arm64, it works fine. The error from my logs:
error: failed to solve: rpc error: code = Unknown desc = executor failed running [/bin/sh -c dotnet restore "APITemperature/APITemperature.csproj"]: exit code: 131
Here are my files and logs.
GitHub Actions workflow file:
name: .NET
on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]
jobs:
  Docker:
    runs-on: ubuntu-20.04
    steps:
      - name: Checkout
        uses: actions/checkout@v2
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v1
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v1
      - name: Cache Docker layers
        uses: actions/cache@v2
        with:
          path: /tmp/.buildx-cache
          key: ${{ runner.os }}-buildx-${{ github.sha }}
          restore-keys: |
            ${{ runner.os }}-buildx-
      - name: Login to DockerHub
        uses: docker/login-action@v1
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - name: Build and push
        uses: docker/build-push-action@v2
        with:
          context: .
          builder: ${{ steps.buildx.outputs.name }}
          platforms: linux/arm/v7
          file: ./APITemperature/Dockerfile
          push: true
          tags: ${{ secrets.DOCKERHUB_USERNAME }}/apitemperature:latest
          cache-from: type=local,src=/tmp/.buildx-cache
          cache-to: type=local,dest=/tmp/.buildx-cache
Dockerfile:
FROM mcr.microsoft.com/dotnet/aspnet:5.0 AS base
WORKDIR /app
EXPOSE 5000
EXPOSE 443

FROM mcr.microsoft.com/dotnet/sdk:5.0 AS build
WORKDIR /src
COPY ["APITemperature/APITemperature.csproj", "APITemperature/"]
RUN dotnet restore "APITemperature/APITemperature.csproj"
COPY . .
WORKDIR "/src/APITemperature"
RUN dotnet build "APITemperature.csproj" -c Release -o /app/build

FROM build AS publish
RUN dotnet publish "APITemperature.csproj" -c Release -o /app/publish

FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "APITemperature.dll"]
My pipeline job logs (with my error): 2021-04-20T14:45:05.9456541Z [command]/usr/bin/git log -1 --format='%H' 2021-04-20T14:45:05.9488487Z '15dffc581efbfb4879571675bdafcd652c5e3817' 2021-04-20T14:45:05.9685714Z ##[group]Run docker/setup-qemu-action@v1 2021-04-20T14:45:05.9686283Z with: 2021-04-20T14:45:05.9686746Z image: tonistiigi/binfmt:latest 2021-04-20T14:45:05.9687272Z platforms: all 2021-04-20T14:45:05.9687702Z ##[endgroup] 2021-04-20T14:45:06.0090840Z ##[group]Pulling binfmt Docker image 2021-04-20T14:45:06.0156093Z [command]/usr/bin/docker pull tonistiigi/binfmt:latest 2021-04-20T14:45:06.6146472Z latest: Pulling from tonistiigi/binfmt 2021-04-20T14:45:06.6841596Z b2cca52c34c9: Pulling fs layer 2021-04-20T14:45:06.6842223Z 6247d1dfaecd: Pulling fs layer 2021-04-20T14:45:06.9020179Z 6247d1dfaecd: Verifying Checksum 2021-04-20T14:45:06.9020848Z 6247d1dfaecd: Download complete 2021-04-20T14:45:06.9328963Z b2cca52c34c9: Verifying Checksum 2021-04-20T14:45:06.9329606Z b2cca52c34c9: Download complete 2021-04-20T14:45:07.3101592Z b2cca52c34c9: Pull complete 2021-04-20T14:45:07.4273645Z 6247d1dfaecd: Pull complete 2021-04-20T14:45:07.4347064Z Digest: sha256:c94a8dab5c7d9913687e77f529fc2a487dcb6aaea2f040e588cfebd778ebcb1a 2021-04-20T14:45:07.4425996Z Status: Downloaded newer image for tonistiigi/binfmt:latest 2021-04-20T14:45:07.4427120Z docker.io/tonistiigi/binfmt:latest 2021-04-20T14:45:07.4476348Z ##[endgroup] 2021-04-20T14:45:07.4477209Z ##[group]Installing QEMU static binaries 2021-04-20T14:45:07.4478809Z [command]/usr/bin/docker run --rm --privileged tonistiigi/binfmt:latest --install all 2021-04-20T14:45:08.4432706Z 2021/04/20 14:45:08 installing: riscv64 OK
2021-04-20T14:45:08.4433411Z 2021/04/20 14:45:08 installing: ppc64le OK 2021-04-20T14:45:08.4434205Z 2021/04/20 14:45:08 installing: mips64el OK 2021-04-20T14:45:08.4434883Z 2021/04/20 14:45:08 installing: mips64 OK 2021-04-20T14:45:08.4435495Z 2021/04/20 14:45:08 installing: arm64 OK 2021-04-20T14:45:08.4466628Z 2021/04/20 14:45:08 installing: arm OK 2021-04-20T14:45:08.4467265Z 2021/04/20 14:45:08 installing: s390x OK 2021-04-20T14:45:08.4467725Z { 2021-04-20T14:45:08.4468262Z "supported": [ 2021-04-20T14:45:08.4468835Z "linux/amd64", 2021-04-20T14:45:08.4469277Z "linux/arm64", 2021-04-20T14:45:08.4469788Z "linux/riscv64", 2021-04-20T14:45:08.4470263Z "linux/ppc64le", 2021-04-20T14:45:08.4470803Z "linux/s390x", 2021-04-20T14:45:08.4471733Z "linux/386", 2021-04-20T14:45:08.4472247Z "linux/mips64le", 2021-04-20T14:45:08.4472778Z "linux/mips64", 2021-04-20T14:45:08.4473223Z "linux/arm/v7", 2021-04-20T14:45:08.4473763Z "linux/arm/v6" 2021-04-20T14:45:08.4475213Z ], 2021-04-20T14:45:08.4475700Z "emulators": [ 2021-04-20T14:45:08.4476126Z "cli", 2021-04-20T14:45:08.4477262Z "llvm-10-runtime.binfmt", 2021-04-20T14:45:08.4478207Z "llvm-11-runtime.binfmt", 2021-04-20T14:45:08.4479011Z "llvm-9-runtime.binfmt", 2021-04-20T14:45:08.4479628Z "python2.7", 2021-04-20T14:45:08.4480080Z "python3.8", 2021-04-20T14:45:08.4480743Z "qemu-aarch64", 2021-04-20T14:45:08.4481356Z "qemu-arm", 2021-04-20T14:45:08.4482035Z "qemu-mips64", 2021-04-20T14:45:08.4482743Z "qemu-mips64el", 2021-04-20T14:45:08.4483395Z "qemu-ppc64le", 2021-04-20T14:45:08.4484100Z "qemu-riscv64", 2021-04-20T14:45:08.4484712Z "qemu-s390x" 2021-04-20T14:45:08.4485196Z ] 2021-04-20T14:45:08.4485744Z } 2021-04-20T14:45:08.5113338Z ##[endgroup] 2021-04-20T14:45:08.5114359Z ##[group]Extracting available platforms 2021-04-20T14:45:09.0506708Z linux/amd64,linux/arm64,linux/riscv64,linux/ppc64le,linux/s390x,linux/386,linux/mips64le,linux/mips64,linux/arm/v7,linux/arm/v6 2021-04-20T14:45:09.0528723Z ##[endgroup] 
2021-04-20T14:45:09.0732277Z ##[group]Run docker/setup-buildx-action#v1 2021-04-20T14:45:09.0732899Z with: 2021-04-20T14:45:09.0733459Z driver: docker-container 2021-04-20T14:45:09.0734757Z buildkitd-flags: --allow-insecure-entitlement security.insecure --allow-insecure-entitlement network.host 2021-04-20T14:45:09.0735981Z install: false 2021-04-20T14:45:09.0736470Z use: true 2021-04-20T14:45:09.0736898Z ##[endgroup] 2021-04-20T14:45:09.4913411Z Using buildx 0.5.1 2021-04-20T14:45:09.4916188Z ##[group]Creating a new builder instance 2021-04-20T14:45:09.4938305Z [command]/usr/bin/docker buildx create --name builder-c0cc5836-daba-4418-aa07-003804840d19 --driver docker-container --buildkitd-flags --allow-insecure-entitlement security.insecure --allow-insecure-entitlement network.host --use 2021-04-20T14:45:09.5860625Z builder-c0cc5836-daba-4418-aa07-003804840d19 2021-04-20T14:45:09.5895737Z ##[endgroup] 2021-04-20T14:45:09.5897247Z ##[group]Booting builder 2021-04-20T14:45:09.5917970Z [command]/usr/bin/docker buildx inspect --bootstrap --builder builder-c0cc5836-daba-4418-aa07-003804840d19 2021-04-20T14:45:09.6752614Z #1 [internal] booting buildkit 2021-04-20T14:45:09.6754735Z #1 sha256:b81dbef062f5e95badead70b79b505149332f554041cc344fa344e1d32563422 2021-04-20T14:45:09.8263703Z #1 pulling image moby/buildkit:buildx-stable-1 2021-04-20T14:45:12.0788111Z #1 pulling image moby/buildkit:buildx-stable-1 2.3s done 2021-04-20T14:45:12.0789582Z #1 creating container buildx_buildkit_builder-c0cc5836-daba-4418-aa07-003804840d190 2021-04-20T14:45:13.1832060Z #1 creating container buildx_buildkit_builder-c0cc5836-daba-4418-aa07-003804840d190 1.2s done 2021-04-20T14:45:13.1833134Z #1 DONE 3.5s 2021-04-20T14:45:13.3455841Z Name: builder-c0cc5836-daba-4418-aa07-003804840d19 2021-04-20T14:45:13.3457016Z Driver: docker-container 2021-04-20T14:45:13.3457390Z 2021-04-20T14:45:13.3457733Z Nodes: 2021-04-20T14:45:13.3459156Z Name: builder-c0cc5836-daba-4418-aa07-003804840d190 
2021-04-20T14:45:13.3460106Z Endpoint: unix:///var/run/docker.sock 2021-04-20T14:45:13.3460648Z Status: running 2021-04-20T14:45:13.3462010Z Flags: --allow-insecure-entitlement security.insecure --allow-insecure-entitlement network.host 2021-04-20T14:45:13.3463408Z Platforms: linux/amd64, linux/arm64, linux/riscv64, linux/ppc64le, linux/s390x, linux/386, linux/arm/v7, linux/arm/v6 2021-04-20T14:45:13.3464666Z ##[endgroup] 2021-04-20T14:45:13.3465308Z ##[group]Extracting available platforms 2021-04-20T14:45:13.5601180Z linux/amd64,linux/arm64,linux/riscv64,linux/ppc64le,linux/s390x,linux/386,linux/arm/v7,linux/arm/v6 2021-04-20T14:45:13.5642286Z ##[endgroup] 2021-04-20T14:45:13.5934211Z ##[group]Run actions/cache#v2 2021-04-20T14:45:13.5934690Z with: 2021-04-20T14:45:13.5935146Z path: /tmp/.buildx-cache 2021-04-20T14:45:13.5936199Z key: Linux-buildx-15dffc581efbfb4879571675bdafcd652c5e3817 2021-04-20T14:45:13.5937315Z restore-keys: Linux-buildx- 2021-04-20T14:45:13.5937860Z ##[endgroup] 2021-04-20T14:45:14.9967489Z Received 159383552 of 176752324 (90.2%), 152.0 MBs/sec 2021-04-20T14:45:15.3871940Z Received 176752324 of 176752324 (100.0%), 121.3 MBs/sec 2021-04-20T14:45:15.3878375Z Cache Size: ~169 MB (176752324 B) 2021-04-20T14:45:15.3917790Z [command]/usr/bin/tar --use-compress-program zstd -d -xf /home/runner/work/_temp/a6d0db92-9c04-4dcc-a5d6-0fec23d64dee/cache.tzst -P -C /home/runner/work/HomeAutomation/HomeAutomation 2021-04-20T14:45:15.6656176Z Cache restored successfully 2021-04-20T14:45:15.6961100Z Cache restored from key: Linux-buildx-3b667c4f1df151892c74b971af8c08cd80d9c23f 2021-04-20T14:45:15.7240763Z ##[group]Run docker/login-action#v1 2021-04-20T14:45:15.7241326Z with: 2021-04-20T14:45:15.7242356Z username: *** 2021-04-20T14:45:15.7243148Z password: *** 2021-04-20T14:45:15.7243584Z logout: true 2021-04-20T14:45:15.7244514Z ##[endgroup] 2021-04-20T14:45:15.7729702Z 🔑 Logging into Docker Hub... 2021-04-20T14:45:15.9554423Z 🎉 Login Succeeded! 
2021-04-20T14:45:15.9696153Z ##[group]Run docker/build-push-action#v2 2021-04-20T14:45:15.9696817Z with: 2021-04-20T14:45:15.9697540Z context: . 2021-04-20T14:45:15.9697960Z platforms: linux/arm/v7 2021-04-20T14:45:15.9698523Z file: ./APITemperature/Dockerfile 2021-04-20T14:45:15.9699036Z push: true 2021-04-20T14:45:15.9699648Z tags: ***/apitemperature:latest 2021-04-20T14:45:15.9700276Z cache-from: type=local,src=/tmp/.buildx-cache 2021-04-20T14:45:15.9700963Z cache-to: type=local,dest=/tmp/.buildx-cache 2021-04-20T14:45:15.9701501Z load: false 2021-04-20T14:45:15.9701898Z no-cache: false 2021-04-20T14:45:15.9702304Z pull: false 2021-04-20T14:45:15.9703198Z github-token: *** 2021-04-20T14:45:15.9703639Z ##[endgroup] 2021-04-20T14:45:16.3155903Z 📣 Buildx version: 0.5.1 2021-04-20T14:45:16.3156772Z 🏃 Starting build... 2021-04-20T14:45:16.3213058Z [command]/usr/bin/docker buildx build --tag ***/apitemperature:latest --platform linux/arm/v7 --iidfile /tmp/docker-build-push-CcTisd/iidfile --cache-from type=local,src=/tmp/.buildx-cache --cache-to type=local,dest=/tmp/.buildx-cache --file ./APITemperature/Dockerfile --push . 
2021-04-20T14:45:16.8524683Z #1 [internal] load build definition from Dockerfile 2021-04-20T14:45:16.8526175Z #1 sha256:31488bf030c607dfb904462da85e48090d86daaf4b0eb253a42fb88be6ab99c0 2021-04-20T14:45:16.8527446Z #1 transferring dockerfile: 789B 0.0s done 2021-04-20T14:45:16.8527982Z #1 DONE 0.0s 2021-04-20T14:45:16.8528263Z 2021-04-20T14:45:16.8528738Z #2 [internal] load .dockerignore 2021-04-20T14:45:16.8530023Z #2 sha256:eb30ca1a72778adbefeaec967db640ec2f7ae4e94bec355fdb1f3f60b544610e 2021-04-20T14:45:17.0024566Z #2 transferring context: 358B done 2021-04-20T14:45:17.0025104Z #2 DONE 0.0s 2021-04-20T14:45:17.0025374Z 2021-04-20T14:45:17.0026089Z #4 [internal] load metadata for mcr.microsoft.com/dotnet/aspnet:5.0 2021-04-20T14:45:17.0027358Z #4 sha256:df98d8a467c16a35cb8a03e607a3e5ff68ba71b77bfb872d36c8812708fb7356 2021-04-20T14:45:17.4240706Z #4 ... 2021-04-20T14:45:17.4243176Z 2021-04-20T14:45:17.4244208Z #3 [internal] load metadata for mcr.microsoft.com/dotnet/sdk:5.0 2021-04-20T14:45:17.4246842Z #3 sha256:ae057f685b8251df61adbf8401f951b0576abc0c9d0e7dc876ee9cd3c103d2e9 2021-04-20T14:45:17.4248062Z #3 DONE 0.5s 2021-04-20T14:45:17.5590668Z 2021-04-20T14:45:17.5591966Z #4 [internal] load metadata for mcr.microsoft.com/dotnet/aspnet:5.0 2021-04-20T14:45:17.5593820Z #4 sha256:df98d8a467c16a35cb8a03e607a3e5ff68ba71b77bfb872d36c8812708fb7356 2021-04-20T14:45:17.5595552Z #4 DONE 0.6s 2021-04-20T14:45:17.7094375Z 2021-04-20T14:45:17.7095297Z #5 importing cache manifest from local:2872713443050105418 2021-04-20T14:45:17.7096433Z #5 sha256:b61bc4b455b0dc74317ca25d6cbb22424609b2e04e685401fa08754d30c20978 2021-04-20T14:45:17.7097383Z #5 DONE 0.0s 2021-04-20T14:45:17.7097687Z 2021-04-20T14:45:17.7098135Z #11 [internal] load build context 2021-04-20T14:45:17.7099264Z #11 sha256:8c6fd30bbbe1675636ece6ce07377fdf2b34b451b04f77945fd61db18676f2f8 2021-04-20T14:45:17.7100434Z #11 transferring context: 39.94kB 0.0s done 2021-04-20T14:45:17.7100931Z #11 DONE 0.0s 
2021-04-20T14:45:17.7101194Z 2021-04-20T14:45:17.7102381Z #6 [base 1/2] FROM mcr.microsoft.com/dotnet/aspnet:5.0#sha256:c0cc95b0d87a31401763f8c7b2a25aa106e7b45bfcaa2f302dc9d0ff5ab93fa2 2021-04-20T14:45:17.7104201Z #6 sha256:c577ebacc45259a20da4a650e09ae539e281e550d7d5370bbbdf05d3c6296734 2021-04-20T14:45:17.7106054Z #6 resolve mcr.microsoft.com/dotnet/aspnet:5.0#sha256:c0cc95b0d87a31401763f8c7b2a25aa106e7b45bfcaa2f302dc9d0ff5ab93fa2 0.0s done 2021-04-20T14:45:17.8354599Z #6 sha256:5783bc1a9017c58328bb93eceb87903ef1882950cd7f38c381358e53b0e2c876 0B / 8.59MB 0.2s 2021-04-20T14:45:17.8356262Z #6 sha256:bda878649c866936e80ab702b25ec5a159daa4cc7d3ead2f3b2d206964cc25d2 0B / 29.68MB 0.2s 2021-04-20T14:45:17.8358016Z #6 sha256:24a233bfa7351886903a6d739111134bff0c3f3bbaed67fa212c78b4975c168b 0B / 154B 0.2s 2021-04-20T14:45:17.8359691Z #6 sha256:bdc9af7b439c9ef15ef901b20a73eebdafa6a03b5881a93ea43c09d43f776f59 8.39MB / 16.12MB 0.2s 2021-04-20T14:45:17.8361690Z #6 sha256:8c6bea184b33030fb923c3c09d634b73235dec3fe2d411db9fd22bda669f2c37 2.72MB / 22.74MB 0.2s 2021-04-20T14:45:17.9652130Z #6 sha256:5783bc1a9017c58328bb93eceb87903ef1882950cd7f38c381358e53b0e2c876 2.10MB / 8.59MB 0.3s 2021-04-20T14:45:17.9653905Z #6 sha256:bdc9af7b439c9ef15ef901b20a73eebdafa6a03b5881a93ea43c09d43f776f59 16.12MB / 16.12MB 0.3s done 2021-04-20T14:45:17.9655731Z #6 sha256:8c6bea184b33030fb923c3c09d634b73235dec3fe2d411db9fd22bda669f2c37 18.87MB / 22.74MB 0.3s 2021-04-20T14:45:18.0801242Z #6 sha256:5783bc1a9017c58328bb93eceb87903ef1882950cd7f38c381358e53b0e2c876 8.59MB / 8.59MB 0.5s 2021-04-20T14:45:18.0803584Z #6 sha256:24a233bfa7351886903a6d739111134bff0c3f3bbaed67fa212c78b4975c168b 154B / 154B 0.3s done 2021-04-20T14:45:18.0805478Z #6 sha256:8c6bea184b33030fb923c3c09d634b73235dec3fe2d411db9fd22bda669f2c37 22.74MB / 22.74MB 0.4s done 2021-04-20T14:45:18.0807192Z #6 extracting sha256:8c6bea184b33030fb923c3c09d634b73235dec3fe2d411db9fd22bda669f2c37 2021-04-20T14:45:18.2299319Z #6 
sha256:5783bc1a9017c58328bb93eceb87903ef1882950cd7f38c381358e53b0e2c876 8.59MB / 8.59MB 0.5s done 2021-04-20T14:45:18.3772729Z #6 sha256:bda878649c866936e80ab702b25ec5a159daa4cc7d3ead2f3b2d206964cc25d2 11.53MB / 29.68MB 0.8s 2021-04-20T14:45:18.5268666Z #6 sha256:bda878649c866936e80ab702b25ec5a159daa4cc7d3ead2f3b2d206964cc25d2 16.78MB / 29.68MB 0.9s 2021-04-20T14:45:18.6768096Z #6 sha256:bda878649c866936e80ab702b25ec5a159daa4cc7d3ead2f3b2d206964cc25d2 25.17MB / 29.68MB 1.1s 2021-04-20T14:45:19.1269561Z #6 sha256:bda878649c866936e80ab702b25ec5a159daa4cc7d3ead2f3b2d206964cc25d2 29.68MB / 29.68MB 1.4s done 2021-04-20T14:45:19.2770888Z #6 extracting sha256:8c6bea184b33030fb923c3c09d634b73235dec3fe2d411db9fd22bda669f2c37 1.1s done 2021-04-20T14:45:19.2772809Z #6 extracting sha256:bdc9af7b439c9ef15ef901b20a73eebdafa6a03b5881a93ea43c09d43f776f59 2021-04-20T14:45:19.7274376Z #6 extracting sha256:bdc9af7b439c9ef15ef901b20a73eebdafa6a03b5881a93ea43c09d43f776f59 0.4s done 2021-04-20T14:45:19.7276299Z #6 extracting sha256:bda878649c866936e80ab702b25ec5a159daa4cc7d3ead2f3b2d206964cc25d2 2021-04-20T14:45:20.3282450Z #6 extracting sha256:bda878649c866936e80ab702b25ec5a159daa4cc7d3ead2f3b2d206964cc25d2 0.6s done 2021-04-20T14:45:20.3284216Z #6 extracting sha256:24a233bfa7351886903a6d739111134bff0c3f3bbaed67fa212c78b4975c168b done 2021-04-20T14:45:20.3285735Z #6 extracting sha256:5783bc1a9017c58328bb93eceb87903ef1882950cd7f38c381358e53b0e2c876 2021-04-20T14:45:20.4783763Z #6 extracting sha256:5783bc1a9017c58328bb93eceb87903ef1882950cd7f38c381358e53b0e2c876 0.2s done 2021-04-20T14:45:20.4784882Z #6 DONE 2.8s 2021-04-20T14:45:20.4785530Z 2021-04-20T14:45:20.4786556Z #9 [build 1/7] FROM mcr.microsoft.com/dotnet/sdk:5.0#sha256:85ea9832ae26c70618418cf7c699186776ad066d88770fd6fd1edea9b260379a 2021-04-20T14:45:20.4788071Z #9 sha256:0d00a70370f81c372582259015e7fbdccd877b07d4409b05152d14e9fa29e545 2021-04-20T14:45:20.4789599Z #9 resolve 
mcr.microsoft.com/dotnet/sdk:5.0#sha256:85ea9832ae26c70618418cf7c699186776ad066d88770fd6fd1edea9b260379a 0.0s done 2021-04-20T14:45:20.4791322Z #9 sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 50.33MB / 101.24MB 2.7s 2021-04-20T14:45:20.4792858Z #9 sha256:f123f733188d3ed3512c733a9c57c6b50a41ce614078c32d17f9e59656929689 24.14MB / 24.14MB 0.7s done 2021-04-20T14:45:20.4794481Z #9 sha256:2d4575e6b4d4d83672f4a97ae481e068968b43e5bba3bc3e4816069609b05fdd 12.12MB / 12.12MB 0.6s done 2021-04-20T14:45:20.4795954Z #9 extracting sha256:f123f733188d3ed3512c733a9c57c6b50a41ce614078c32d17f9e59656929689 2021-04-20T14:45:22.7187857Z #9 ... 2021-04-20T14:45:22.7189696Z 2021-04-20T14:45:22.7190225Z #7 [base 2/2] WORKDIR /app 2021-04-20T14:45:22.7191404Z #7 sha256:5c23f9805dd85e2afec193446c0f683d7c96032a135115b14ddfdb6df4958bfd 2021-04-20T14:45:22.7192508Z #7 DONE 2.3s 2021-04-20T14:45:22.7193155Z 2021-04-20T14:45:22.7194660Z #9 [build 1/7] FROM mcr.microsoft.com/dotnet/sdk:5.0#sha256:85ea9832ae26c70618418cf7c699186776ad066d88770fd6fd1edea9b260379a 2021-04-20T14:45:22.7196568Z #9 sha256:0d00a70370f81c372582259015e7fbdccd877b07d4409b05152d14e9fa29e545 2021-04-20T14:45:22.7198260Z #9 extracting sha256:f123f733188d3ed3512c733a9c57c6b50a41ce614078c32d17f9e59656929689 2.3s done 2021-04-20T14:45:22.8689339Z #9 ... 
2021-04-20T14:45:22.8689775Z 2021-04-20T14:45:22.8690210Z #8 [final 1/2] WORKDIR /app 2021-04-20T14:45:22.8691402Z #8 sha256:cbffd7fd0ae0083e38548dcfffb11e308f00a71d953b479b3e1ea38f8fc7582c 2021-04-20T14:45:22.8692506Z #8 DONE 0.0s 2021-04-20T14:45:22.8692780Z 2021-04-20T14:45:22.8693794Z #9 [build 1/7] FROM mcr.microsoft.com/dotnet/sdk:5.0#sha256:85ea9832ae26c70618418cf7c699186776ad066d88770fd6fd1edea9b260379a 2021-04-20T14:45:22.8695280Z #9 sha256:0d00a70370f81c372582259015e7fbdccd877b07d4409b05152d14e9fa29e545 2021-04-20T14:45:25.5718307Z #9 sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 50.33MB / 101.24MB 7.8s 2021-04-20T14:45:30.6765174Z #9 sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 50.33MB / 101.24MB 12.9s 2021-04-20T14:45:35.7813466Z #9 sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 50.33MB / 101.24MB 18.0s 2021-04-20T14:45:40.8869493Z #9 sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 50.33MB / 101.24MB 23.1s 2021-04-20T14:45:45.9920095Z #9 sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 50.33MB / 101.24MB 28.2s 2021-04-20T14:45:51.9547824Z #9 sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 50.33MB / 101.24MB 33.3s 2021-04-20T14:45:56.2024446Z #9 sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 50.33MB / 101.24MB 38.4s 2021-04-20T14:46:01.3075982Z #9 sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 50.33MB / 101.24MB 43.5s 2021-04-20T14:46:06.4127083Z #9 sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 50.33MB / 101.24MB 48.6s 2021-04-20T14:46:11.5173712Z #9 sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 50.33MB / 101.24MB 53.7s 2021-04-20T14:46:16.4693192Z #9 sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 50.33MB / 101.24MB 58.8s 2021-04-20T14:46:21.5692555Z #9 
sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 50.33MB / 101.24MB 63.9s 2021-04-20T14:46:25.9653724Z #9 67.99 error: failed to copy: read tcp 172.17.0.2:54176->131.253.33.219:443: read: connection reset by peer 2021-04-20T14:46:25.9655097Z #9 67.99 retrying in 1s 2021-04-20T14:46:27.1203732Z #9 sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 61.87MB / 101.24MB 0.2s 2021-04-20T14:46:27.2705044Z #9 sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 83.89MB / 101.24MB 0.3s 2021-04-20T14:46:27.4207278Z #9 sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 95.42MB / 101.24MB 0.5s 2021-04-20T14:46:27.5707246Z #9 sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 100.66MB / 101.24MB 0.6s 2021-04-20T14:46:27.8710117Z #9 sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 101.24MB / 101.24MB 1.0s done 2021-04-20T14:46:27.8711926Z #9 extracting sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 2021-04-20T14:46:30.4226611Z #9 extracting sha256:a8347b3a1feb3898ec8dfc50429da45e1e17c563a4a35157d02c56819f9fcac0 2.5s done 2021-04-20T14:46:30.4228882Z #9 extracting sha256:2d4575e6b4d4d83672f4a97ae481e068968b43e5bba3bc3e4816069609b05fdd 2021-04-20T14:46:30.8732258Z #9 extracting sha256:2d4575e6b4d4d83672f4a97ae481e068968b43e5bba3bc3e4816069609b05fdd 0.4s done 2021-04-20T14:46:30.8733432Z #9 DONE 73.1s 2021-04-20T14:46:30.8733741Z 2021-04-20T14:46:30.8734199Z #10 [build 2/7] WORKDIR /src 2021-04-20T14:46:30.8735339Z #10 sha256:43ac7aa94895d63bb0307f69eaa59d0917a25939cff1b1495a74ca45174bea91 2021-04-20T14:46:32.3729092Z #10 DONE 1.6s 2021-04-20T14:46:32.5229794Z 2021-04-20T14:46:32.5231299Z #12 [build 3/7] COPY [APITemperature/APITemperature.csproj, APITemperature/] 2021-04-20T14:46:32.5233695Z #12 sha256:d09810ae2ccefe4219b8df5128bdd020d0e845df4f233eeaed26900113316a0a 2021-04-20T14:46:32.5235328Z #12 DONE 0.0s 
2021-04-20T14:46:32.5235686Z 2021-04-20T14:46:32.5236698Z #13 [build 4/7] RUN dotnet restore "APITemperature/APITemperature.csproj" 2021-04-20T14:46:32.5238617Z #13 sha256:49ec9cefad38e8249090ecba39417e2da86f6517f8810c31067ad63a5c7e6ff7 2021-04-20T14:46:32.5642823Z #13 0.149 A fatal error occurred, the folder [/usr/share/dotnet/host/fxr] does not contain any version-numbered child folders 2021-04-20T14:46:32.5660679Z #13 ERROR: executor failed running [/bin/sh -c dotnet restore "APITemperature/APITemperature.csproj"]: exit code: 131 2021-04-20T14:46:32.5662015Z ------ 2021-04-20T14:46:32.5662786Z > [build 4/7] RUN dotnet restore "APITemperature/APITemperature.csproj": 2021-04-20T14:46:32.5663632Z ------ 2021-04-20T14:46:32.5664082Z Dockerfile:13 2021-04-20T14:46:32.5664665Z -------------------- 2021-04-20T14:46:32.5665106Z 11 | WORKDIR /src 2021-04-20T14:46:32.5665924Z 12 | COPY ["APITemperature/APITemperature.csproj", "APITemperature/"] 2021-04-20T14:46:32.5667060Z 13 | >>> RUN dotnet restore "APITemperature/APITemperature.csproj" 2021-04-20T14:46:32.5667789Z 14 | COPY . . 2021-04-20T14:46:32.5668299Z 15 | WORKDIR "/src/APITemperature" 2021-04-20T14:46:32.5668978Z -------------------- 2021-04-20T14:46:32.5670386Z error: failed to solve: rpc error: code = Unknown desc = executor failed running [/bin/sh -c dotnet restore "APITemperature/APITemperature.csproj"]: exit code: 131 2021-04-20T14:46:32.5808114Z ##[error]buildx call failed with: error: failed to solve: rpc error: code = Unknown desc = executor failed running [/bin/sh -c dotnet restore "APITemperature/APITemperature.csproj"]: exit code: 131 2021-04-20T14:46:32.5937546Z Post job cleanup. 2021-04-20T14:46:32.6860546Z 🚿 Removing temp folder /tmp/docker-build-push-CcTisd 2021-04-20T14:46:32.6962378Z Post job cleanup. 2021-04-20T14:46:32.7472709Z [command]/usr/bin/docker logout 2021-04-20T14:46:32.8040667Z Removing login credentials for https://index.docker.io/v1/ 2021-04-20T14:46:32.8189690Z Post job cleanup. 
2021-04-20T14:46:32.8819799Z [command]/usr/bin/docker buildx rm builder-c0cc5836-daba-4418-aa07-003804840d19 2021-04-20T14:46:33.4718542Z Post job cleanup. 2021-04-20T14:46:33.5839655Z [command]/usr/bin/git version 2021-04-20T14:46:33.5895531Z git version 2.31.1 2021-04-20T14:46:33.5957268Z [command]/usr/bin/git config --local --name-only --get-regexp core\.sshCommand 2021-04-20T14:46:33.6001316Z [command]/usr/bin/git submodule foreach --recursive git config --local --name-only --get-regexp 'core\.sshCommand' && git config --local --unset-all 'core.sshCommand' || : 2021-04-20T14:46:33.6273184Z [command]/usr/bin/git config --local --name-only --get-regexp http\.https\:\/\/github\.com\/\.extraheader 2021-04-20T14:46:33.6298080Z http.https://github.com/.extraheader 2021-04-20T14:46:33.6308210Z [command]/usr/bin/git config --local --unset-all http.https://github.com/.extraheader 2021-04-20T14:46:33.6348266Z [command]/usr/bin/git submodule foreach --recursive git config --local --name-only --get-regexp 'http\.https\:\/\/github\.com\/\.extraheader' && git config --local --unset-all 'http.https://github.com/.extraheader' || : 2021-04-20T14:46:33.6689104Z Cleaning up orphan processes
Thanks a lot for your help!
I have the same problem. My workaround is to do a normal build (without buildx) for linux/arm/v7. Please read this thread: https://github.com/dotnet/dotnet-docker/issues/1537#issuecomment-615269150
You can make a few changes in your Dockerfile. Build with the multi-platform tag:
FROM mcr.microsoft.com/dotnet/sdk:5.0 AS build
Then publish with the arm32v7 tag:
FROM mcr.microsoft.com/dotnet/aspnet:5.0-bullseye-slim-arm32v7 AS base
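Applied to the question's Dockerfile, the answer's two tag changes would look roughly like this (a sketch, untested; it assumes a plain docker build on an amd64 runner, so the SDK stage runs natively while the runtime image is explicitly 32-bit ARM):

```dockerfile
# Runtime stage pinned to the arm32v7 variant, per the linked dotnet-docker thread
FROM mcr.microsoft.com/dotnet/aspnet:5.0-bullseye-slim-arm32v7 AS base
WORKDIR /app
EXPOSE 5000
EXPOSE 443

# SDK stage keeps the multi-platform tag: restore/build/publish run on the
# builder's native architecture, avoiding the emulated dotnet's fxr crash
FROM mcr.microsoft.com/dotnet/sdk:5.0 AS build
WORKDIR /src
COPY ["APITemperature/APITemperature.csproj", "APITemperature/"]
RUN dotnet restore "APITemperature/APITemperature.csproj"
COPY . .
WORKDIR "/src/APITemperature"
RUN dotnet publish "APITemperature.csproj" -c Release -o /app/publish

FROM base AS final
WORKDIR /app
COPY --from=build /app/publish .
ENTRYPOINT ["dotnet", "APITemperature.dll"]
```

The trade-off is that the resulting image is single-arch (arm/v7 only) rather than a true multi-arch manifest.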