Can't install Python shapely package on Alpine-based Docker image - docker

I am trying to build a custom Docker image from the nodered/node-red image (Alpine-based). I have tried many approaches and looked at many solutions on Google, but I still get an error when installing the shapely package. My Dockerfile is:
FROM nodered/node-red
USER root
RUN apk update
RUN apk add py3-pip
RUN pip install pymap3d
RUN apk --update add build-base libxslt-dev
RUN apk add --virtual .build-deps \
--repository http://dl-cdn.alpinelinux.org/alpine/edge/testing \
--repository http://dl-cdn.alpinelinux.org/alpine/edge/main \
gcc libc-dev geos-dev geos && \
runDeps="$(scanelf --needed --nobanner --recursive /usr/local \
| awk '{ gsub(/,/, "\nso:", $2); print "so:" $2 }' \
| xargs -r apk info --installed \
| sort -u)" && \
apk add --virtual .rundeps $runDeps
RUN geos-config --cflags
#RUN pip install --disable-pip-version-check shapely
RUN apk del build-base python3-dev && \
rm -rf /var/cache/apk/*
RUN pip install shapely
I have tried every solution I could find on the internet, but I can't fix the error. I have looked here and here and here and here and at some other pages, but my problem is still not solved.
Update:
The docker build output is:
Step 10/10 : RUN pip install shapely
---> Running in a5f6d97702e7
Collecting shapely
Downloading shapely-2.0.0.tar.gz (274 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 274.5/274.5 kB 956.0 kB/s eta 0:00:00
Installing build dependencies: started
Installing build dependencies: finished with status 'error'
error: subprocess-exited-with-error
× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> [301 lines of output]
Collecting Cython~=0.29
Downloading Cython-0.29.32-cp310-cp310-musllinux_1_1_x86_64.whl (2.0 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.0/2.0 MB 1.5 MB/s eta 0:00:00
Collecting oldest-supported-numpy
Downloading oldest_supported_numpy-2022.11.19-py3-none-any.whl (4.9 kB)
Collecting setuptools>=61.0.0
Downloading setuptools-65.6.3-py3-none-any.whl (1.2 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.2/1.2 MB 1.6 MB/s eta 0:00:00
Collecting numpy==1.21.6
Downloading numpy-1.21.6.zip (10.3 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 10.3/10.3 MB 1.6 MB/s eta 0:00:00
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status 'done'
Preparing metadata (pyproject.toml): started
Preparing metadata (pyproject.toml): finished with status 'done'
Building wheels for collected packages: numpy
Building wheel for numpy (pyproject.toml): started
Building wheel for numpy (pyproject.toml): finished with status 'error'
error: subprocess-exited-with-error
× Building wheel for numpy (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [270 lines of output]
setup.py:63: RuntimeWarning: NumPy 1.21.6 may not yet support Python 3.10.
warnings.warn(
Running from numpy source directory.
/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/tools/cythonize.py:69: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
from distutils.version import LooseVersion
Processing numpy/random/_bounded_integers.pxd.in
Processing numpy/random/mtrand.pyx
Processing numpy/random/_sfc64.pyx
Processing numpy/random/bit_generator.pyx
Processing numpy/random/_philox.pyx
Processing numpy/random/_mt19937.pyx
Processing numpy/random/_common.pyx
Processing numpy/random/_generator.pyx
Processing numpy/random/_pcg64.pyx
Processing numpy/random/_bounded_integers.pyx.in
Cythonizing sources
blas_opt_info:
blas_mkl_info:
customize UnixCCompiler
libraries mkl_rt not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
blis_info:
libraries blis not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
openblas_info:
libraries openblas not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
accelerate_info:
NOT AVAILABLE
atlas_3_10_blas_threads_info:
Setting PTATLAS=ATLAS
libraries tatlas not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
atlas_3_10_blas_info:
libraries satlas not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
atlas_blas_threads_info:
Setting PTATLAS=ATLAS
libraries ptf77blas,ptcblas,atlas not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
atlas_blas_info:
libraries f77blas,cblas,atlas not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/system_info.py:2026: UserWarning:
Optimized (vendor) Blas libraries are not found.
Falls back to netlib Blas library which has worse performance.
A better performance should be easily gained by switching
Blas library.
if self._calc_info(blas):
blas_info:
libraries blas not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/system_info.py:2026: UserWarning:
Blas (http://www.netlib.org/blas/) libraries not found.
Directories to search for the libraries can be specified in the
numpy/distutils/site.cfg file (section [blas]) or by setting
the BLAS environment variable.
if self._calc_info(blas):
blas_src_info:
NOT AVAILABLE
/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/system_info.py:2026: UserWarning:
Blas (http://www.netlib.org/blas/) sources not found.
Directories to search for the sources can be specified in the
numpy/distutils/site.cfg file (section [blas_src]) or by setting
the BLAS_SRC environment variable.
if self._calc_info(blas):
NOT AVAILABLE
non-existing path in 'numpy/distutils': 'site.cfg'
lapack_opt_info:
lapack_mkl_info:
libraries mkl_rt not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
openblas_lapack_info:
libraries openblas not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
openblas_clapack_info:
libraries openblas,lapack not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
flame_info:
libraries flame not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
atlas_3_10_threads_info:
Setting PTATLAS=ATLAS
libraries lapack_atlas not found in /usr/local/lib
libraries tatlas,tatlas not found in /usr/local/lib
libraries lapack_atlas not found in /usr/lib
libraries tatlas,tatlas not found in /usr/lib
libraries lapack_atlas not found in /usr/lib/
libraries tatlas,tatlas not found in /usr/lib/
<class 'numpy.distutils.system_info.atlas_3_10_threads_info'>
NOT AVAILABLE
atlas_3_10_info:
libraries lapack_atlas not found in /usr/local/lib
libraries satlas,satlas not found in /usr/local/lib
libraries lapack_atlas not found in /usr/lib
libraries satlas,satlas not found in /usr/lib
libraries lapack_atlas not found in /usr/lib/
libraries satlas,satlas not found in /usr/lib/
<class 'numpy.distutils.system_info.atlas_3_10_info'>
NOT AVAILABLE
atlas_threads_info:
Setting PTATLAS=ATLAS
libraries lapack_atlas not found in /usr/local/lib
libraries ptf77blas,ptcblas,atlas not found in /usr/local/lib
libraries lapack_atlas not found in /usr/lib
libraries ptf77blas,ptcblas,atlas not found in /usr/lib
libraries lapack_atlas not found in /usr/lib/
libraries ptf77blas,ptcblas,atlas not found in /usr/lib/
<class 'numpy.distutils.system_info.atlas_threads_info'>
NOT AVAILABLE
atlas_info:
libraries lapack_atlas not found in /usr/local/lib
libraries f77blas,cblas,atlas not found in /usr/local/lib
libraries lapack_atlas not found in /usr/lib
libraries f77blas,cblas,atlas not found in /usr/lib
libraries lapack_atlas not found in /usr/lib/
libraries f77blas,cblas,atlas not found in /usr/lib/
<class 'numpy.distutils.system_info.atlas_info'>
NOT AVAILABLE
lapack_info:
libraries lapack not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/system_info.py:1858: UserWarning:
Lapack (http://www.netlib.org/lapack/) libraries not found.
Directories to search for the libraries can be specified in the
numpy/distutils/site.cfg file (section [lapack]) or by setting
the LAPACK environment variable.
return getattr(self, '_calc_info_{}'.format(name))()
lapack_src_info:
NOT AVAILABLE
/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/system_info.py:1858: UserWarning:
Lapack (http://www.netlib.org/lapack/) sources not found.
Directories to search for the sources can be specified in the
numpy/distutils/site.cfg file (section [lapack_src]) or by setting
the LAPACK_SRC environment variable.
return getattr(self, '_calc_info_{}'.format(name))()
NOT AVAILABLE
numpy_linalg_lapack_lite:
FOUND:
language = c
define_macros = [('HAVE_BLAS_ILP64', None), ('BLAS_SYMBOL_SUFFIX', '64_')]
Warning: attempted relative import with no known parent package
/usr/lib/python3.10/distutils/dist.py:274: UserWarning: Unknown distribution option: 'define_macros'
warnings.warn(msg)
running bdist_wheel
running build
running config_cc
unifing config_cc, config, build_clib, build_ext, build commands --compiler options
running config_fc
unifing config_fc, config, build_clib, build_ext, build commands --fcompiler options
running build_src
build_src
building py_modules sources
creating build
creating build/src.linux-x86_64-3.10
creating build/src.linux-x86_64-3.10/numpy
creating build/src.linux-x86_64-3.10/numpy/distutils
building library "npymath" sources
Could not locate executable gfortran
Could not locate executable f95
Could not locate executable ifort
Could not locate executable ifc
Could not locate executable lf95
Could not locate executable pgfortran
Could not locate executable nvfortran
Could not locate executable f90
Could not locate executable f77
Could not locate executable fort
Could not locate executable efort
Could not locate executable efc
Could not locate executable g77
Could not locate executable g95
Could not locate executable pathf95
Could not locate executable nagfor
Could not locate executable frt
don't know how to compile Fortran code on platform 'posix'
creating build/src.linux-x86_64-3.10/numpy/core
creating build/src.linux-x86_64-3.10/numpy/core/src
creating build/src.linux-x86_64-3.10/numpy/core/src/npymath
conv_template:> build/src.linux-x86_64-3.10/numpy/core/src/npymath/npy_math_internal.h
adding 'build/src.linux-x86_64-3.10/numpy/core/src/npymath' to include_dirs.
conv_template:> build/src.linux-x86_64-3.10/numpy/core/src/npymath/ieee754.c
conv_template:> build/src.linux-x86_64-3.10/numpy/core/src/npymath/npy_math_complex.c
None - nothing done with h_files = ['build/src.linux-x86_64-3.10/numpy/core/src/npymath/npy_math_internal.h']
building library "npyrandom" sources
building extension "numpy.core._multiarray_tests" sources
creating build/src.linux-x86_64-3.10/numpy/core/src/multiarray
conv_template:> build/src.linux-x86_64-3.10/numpy/core/src/multiarray/_multiarray_tests.c
building extension "numpy.core._multiarray_umath" sources
Traceback (most recent call last):
File "/tmp/tmp0bhr5u48_in_process.py", line 363, in <module>
main()
File "/tmp/tmp0bhr5u48_in_process.py", line 345, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "/tmp/tmp0bhr5u48_in_process.py", line 261, in build_wheel
return _build_backend().build_wheel(wheel_directory, config_settings,
File "/tmp/pip-build-env-eya7durf/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 230, in build_wheel
return self._build_with_temp_dir(['bdist_wheel'], '.whl',
File "/tmp/pip-build-env-eya7durf/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 215, in _build_with_temp_dir
self.run_setup()
File "/tmp/pip-build-env-eya7durf/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 267, in run_setup
super(_BuildMetaLegacyBackend,
File "/tmp/pip-build-env-eya7durf/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 158, in run_setup
exec(compile(code, __file__, 'exec'), locals())
File "setup.py", line 448, in <module>
setup_package()
File "setup.py", line 440, in setup_package
setup(**metadata)
File "/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/core.py", line 169, in setup
return old_setup(**new_attr)
File "/tmp/pip-build-env-eya7durf/overlay/lib/python3.10/site-packages/setuptools/__init__.py", line 153, in setup
return distutils.core.setup(**attrs)
File "/usr/lib/python3.10/distutils/core.py", line 148, in setup
dist.run_commands()
File "/usr/lib/python3.10/distutils/dist.py", line 966, in run_commands
self.run_command(cmd)
File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
cmd_obj.run()
File "/tmp/pip-build-env-eya7durf/overlay/lib/python3.10/site-packages/wheel/bdist_wheel.py", line 299, in run
self.run_command('build')
File "/usr/lib/python3.10/distutils/cmd.py", line 313, in run_command
self.distribution.run_command(command)
File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
cmd_obj.run()
File "/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/command/build.py", line 61, in run
old_build.run(self)
File "/usr/lib/python3.10/distutils/command/build.py", line 135, in run
self.run_command(cmd_name)
File "/usr/lib/python3.10/distutils/cmd.py", line 313, in run_command
self.distribution.run_command(command)
File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
cmd_obj.run()
File "/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/command/build_src.py", line 144, in run
self.build_sources()
File "/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/command/build_src.py", line 161, in build_sources
self.build_extension_sources(ext)
File "/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/command/build_src.py", line 318, in build_extension_sources
sources = self.generate_sources(sources, ext)
File "/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/command/build_src.py", line 378, in generate_sources
source = func(extension, build_dir)
File "/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/core/setup.py", line 434, in generate_config_h
moredefs, ignored = cocache.check_types(config_cmd, ext, build_dir)
File "/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/core/setup.py", line 44, in check_types
out = check_types(*a, **kw)
File "/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/core/setup.py", line 289, in check_types
raise SystemError(
SystemError: Cannot compile 'Python.h'. Perhaps you need to install python-dev|python-devel.
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for numpy
Failed to build numpy
ERROR: Could not build wheels for numpy, which is required to install pyproject.toml-based projects
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error
× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.

From this: https://github.com/docker-library/python/issues/381#issuecomment-463880366 I added the suggested lines to the Dockerfile, and the build now succeeds.
New Dockerfile:
FROM nodered/node-red
USER root
RUN apk update
RUN apk add py3-pip
RUN pip install pymap3d
RUN apk --update add build-base libxslt-dev
RUN apk add --virtual .build-deps \
--repository http://dl-cdn.alpinelinux.org/alpine/edge/testing \
--repository http://dl-cdn.alpinelinux.org/alpine/edge/main \
gcc libc-dev geos-dev geos && \
runDeps="$(scanelf --needed --nobanner --recursive /usr/local \
| awk '{ gsub(/,/, "\nso:", $2); print "so:" $2 }' \
| xargs -r apk info --installed \
| sort -u)" && \
apk add --virtual .rundeps $runDeps
RUN geos-config --cflags
#RUN pip install --disable-pip-version-check shapely
#RUN apk del build-base python3-dev && \
# rm -rf /var/cache/apk/*
RUN apk add --no-cache python3-dev libstdc++ && \
apk add --no-cache g++ && \
ln -s /usr/include/locale.h /usr/include/xlocale.h && \
pip3 install numpy && \
pip3 install pandas
RUN pip install shapely
Build output:
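To sanity-check the fixed image, one can build it and run a quick import test (a sketch; the nodered-shapely tag is just an example name, and --entrypoint is used because the node-red base image defines its own entrypoint):
docker build -t nodered-shapely .
docker run --rm --entrypoint python3 nodered-shapely -c "import shapely; print(shapely.__version__)"
If a version number is printed, shapely was built and linked against GEOS successfully.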

Related

How to launch Jupyter with a custom extension using Docker

I am trying to launch, via Docker, JupyterLab with a custom extension (extensionTest). However, I haven't been successful.
Can someone tell me how I can do it, or post an example here?
What is the best base Docker image to use?
Thank you
Best regards
I tried this Dockerfile, but it is not working:
FROM jupyter/minimal-notebook:lab-3.2.3
RUN pip install --no-cache-dir \
astropy \
ipytree \
ipywidgets \
jupyter \
numpy \
poliastro
RUN jupyter labextension install \
jupyterlab-plotly@4.14.2 \
plotlywidget@4.14.2
RUN pip install jupyterlab_widgets
COPY ./extensions/ ./extensions/
WORKDIR ./extensions/
RUN python -m pip install ./extensionTest
RUN jupyter labextension extensionTest
ENTRYPOINT start.sh jupyter lab
Thanks
The error that I get:
ERROR: Command errored out with exit status 1:
command: /opt/conda/bin/python /opt/conda/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py build_wheel /tmp/tmp2m0biqur
cwd: /home/jovyan/extensions/extensionTest
Complete output (44 lines):
INFO:hatch_jupyter_builder.utils:Running jupyter-builder
INFO:hatch_jupyter_builder.utils:Building with hatch_jupyter_builder.npm_builder
INFO:hatch_jupyter_builder.utils:With kwargs: {'build_cmd': 'build:prod', 'npm': ['jlpm']}
INFO:hatch_jupyter_builder.utils:Installing build dependencies with npm. This may take a while...
INFO:hatch_jupyter_builder.utils:> /tmp/pip-build-env-gj35ixm0/overlay/bin/jlpm install
yarn install v1.21.1
info No lockfile found.
[1/4] Resolving packages...
warning @jupyterlab/application > @jupyterlab/apputils > url > querystring@0.2.0: The querystring API is considered Legacy. new code should use the URLSearchParams API instead.
warning @jupyterlab/application > @jupyterlab/ui-components > @blueprintjs/core > popper.js@1.16.1: You can find the new Popper v2 at @popperjs/core, this package is dedicated to the legacy v1
warning @jupyterlab/application > @jupyterlab/ui-components > @blueprintjs/core > react-popper > popper.js@1.16.1: You can find the new Popper v2 at @popperjs/core, this package is dedicated to the legacy v1
warning @jupyterlab/builder > terser-webpack-plugin > cacache > @npmcli/move-file@1.1.2: This functionality has been moved to @npmcli/fs
warning @jupyterlab/builder > @jupyterlab/buildutils > crypto@1.0.1: This package is no longer supported. It's now a built-in Node module. If you've depended on crypto, you should switch to the one that's built-in.
warning @jupyterlab/builder > @jupyterlab/buildutils > verdaccio > request@2.88.0: request has been deprecated, see https://github.com/request/request/issues/3142
warning @jupyterlab/builder > @jupyterlab/buildutils > verdaccio > request > har-validator@5.1.5: this library is no longer supported
warning @jupyterlab/builder > @jupyterlab/buildutils > verdaccio > request > uuid@3.4.0: Please upgrade to version 7 or higher. Older versions may use Math.random() in certain circumstances, which is known to be problematic. See https://v8.dev/blog/math-random for details.
[2/4] Fetching packages...
warning @blueprintjs/core@3.54.0: Invalid bin entry for "upgrade-blueprint-2.0.0-rename" (in "@blueprintjs/core").
warning @blueprintjs/core@3.54.0: Invalid bin entry for "upgrade-blueprint-3.0.0-rename" (in "@blueprintjs/core").
error lib0@0.2.58: The engine "node" is incompatible with this module. Expected version ">=14". Got "12.4.0"
error Found incompatible module.
info Visit https://yarnpkg.com/en/docs/cli/install for documentation about this command.
Traceback (most recent call last):
File "/opt/conda/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 363, in <module>
main()
File "/opt/conda/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 345, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "/opt/conda/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 261, in build_wheel
return _build_backend().build_wheel(wheel_directory, config_settings,
File "/tmp/pip-build-env-gj35ixm0/overlay/lib/python3.9/site-packages/hatchling/build.py", line 41, in build_wheel
return os.path.basename(next(builder.build(wheel_directory, ['standard'])))
File "/tmp/pip-build-env-gj35ixm0/overlay/lib/python3.9/site-packages/hatchling/builders/plugin/interface.py", line 136, in build
build_hook.initialize(version, build_data)
File "/tmp/pip-build-env-gj35ixm0/normal/lib/python3.9/site-packages/hatch_jupyter_builder/plugin.py", line 83, in initialize
raise e
File "/tmp/pip-build-env-gj35ixm0/normal/lib/python3.9/site-packages/hatch_jupyter_builder/plugin.py", line 78, in initialize
build_func(self.target_name, version, **build_kwargs)
File "/tmp/pip-build-env-gj35ixm0/normal/lib/python3.9/site-packages/hatch_jupyter_builder/utils.py", line 114, in npm_builder
run(npm_cmd + ["install"], cwd=str(abs_path))
File "/tmp/pip-build-env-gj35ixm0/normal/lib/python3.9/site-packages/hatch_jupyter_builder/utils.py", line 227, in run
return subprocess.check_call(cmd, **kwargs)
File "/opt/conda/lib/python3.9/subprocess.py", line 373, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['/tmp/pip-build-env-gj35ixm0/overlay/bin/jlpm', 'install']' returned non-zero exit status 1.
So, to anyone interested, I was able to create a Dockerfile that builds and runs your Jupyter extension. My code is this:
FROM node:14 AS build-env
RUN apt-get update && \
apt-get install -y python3-pip && \
pip3 install jupyterlab
COPY Path/to/Extension Path/to/Extension
WORKDIR Path/to/extension
RUN yarn install && yarn build && yarn run build
FROM jupyter/minimal-notebook:lab-3.2.3
RUN pip install --no-cache-dir \
astropy \
ipytree \
ipywidgets \
jupyter \
numpy \
poliastro
RUN jupyter labextension install \
jupyterlab-plotly@4.14.2 \
@jupyter-widgets/jupyterlab-manager \
plotlywidget@4.14.2
COPY --from=build-env Path/to/Extension/on/Nodejs Path/to/Extension
RUN jupyter labextension install Path/to/Extension
If someone knows something that can be simplified, let me know :)
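For context, the failure in the error log above comes from the Node.js runtime available to jlpm: lib0 requires Node >= 14, but 12.4.0 was found, which is why the extension is built in a separate node:14 stage here. A quick way to check which Node version an image provides (a sketch; it assumes node is on the PATH in that image, as the log suggests):
docker run --rm jupyter/minimal-notebook:lab-3.2.3 node --version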

I'm having problems while running AlphaCode on Ubuntu

I am using Docker with Ubuntu.
I have installed the full dataset (dm-code_contests) to the /tmp folder and cloned the Git repository into the /home folder (the repository is code_contests). When I try to run bazel run -c opt \ :print_names_and_sources /tmp/dm-code_contests/code_contests_valid.riegeli (in the /home/code_contests folder), it shows this error:
Starting local Bazel server and connecting to it...
INFO: Repository local_config_python instantiated at:
/home/code_contests/WORKSPACE:12:10: in <toplevel>
/root/.cache/bazel/_bazel_root/24a36d3f089e715b642fd688d4461183/external/com_github_grpc_grpc/bazel/grpc_deps.bzl:414:21: in grpc_deps
/root/.cache/bazel/_bazel_root/24a36d3f089e715b642fd688d4461183/external/com_github_grpc_grpc/bazel/grpc_python_deps.bzl:43:21: in grpc_python_deps
Repository rule python_configure defined at:
/root/.cache/bazel/_bazel_root/24a36d3f089e715b642fd688d4461183/external/com_github_grpc_grpc/third_party/py/python_configure.bzl:365:35: in <toplevel>
ERROR: An error occurred during the fetch of repository 'local_config_python':
Traceback (most recent call last):
File "/root/.cache/bazel/_bazel_root/24a36d3f089e715b642fd688d4461183/external/com_github_grpc_grpc/third_party/py/python_configure.bzl", line 355, column 35, in _python_autoconf_impl
_create_single_version_package(
File "/root/.cache/bazel/_bazel_root/24a36d3f089e715b642fd688d4461183/external/com_github_grpc_grpc/third_party/py/python_configure.bzl", line 304, column 45, in _create_single_version_package
python_include = _get_python_include(repository_ctx, python_bin)
File "/root/.cache/bazel/_bazel_root/24a36d3f089e715b642fd688d4461183/external/com_github_grpc_grpc/third_party/py/python_configure.bzl", line 236, column 22, in _get_python_include
result = _execute(
File "/root/.cache/bazel/_bazel_root/24a36d3f089e715b642fd688d4461183/external/com_github_grpc_grpc/third_party/py/python_configure.bzl", line 62, column 14, in _execute
_fail("\n".join([
File "/root/.cache/bazel/_bazel_root/24a36d3f089e715b642fd688d4461183/external/com_github_grpc_grpc/third_party/py/python_configure.bzl", line 35, column 9, in _fail
fail("%sPython Configuration Error:%s %s\n" % (red, no_color, msg))
Error in fail: Python Configuration Error: Problem getting python include path for /usr/bin/python3.
<string>:1: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
<string>:1: DeprecationWarning: The distutils.sysconfig module is deprecated, use sysconfig instead
Is the Python binary path set up right? (See ./configure or /usr/bin/python3.) Is distutils installed? Are Python headers installed? Try installing python-dev or python3-dev on Debian-based systems. Try python-devel or python3-devel on Redhat-based systems.
ERROR: /home/code_contests/WORKSPACE:12:10: fetching python_configure rule //external:local_config_python: Traceback (most recent call last):
File "/root/.cache/bazel/_bazel_root/24a36d3f089e715b642fd688d4461183/external/com_github_grpc_grpc/third_party/py/python_configure.bzl", line 355, column 35, in _python_autoconf_impl
_create_single_version_package(
File "/root/.cache/bazel/_bazel_root/24a36d3f089e715b642fd688d4461183/external/com_github_grpc_grpc/third_party/py/python_configure.bzl", line 304, column 45, in _create_single_version_package
python_include = _get_python_include(repository_ctx, python_bin)
File "/root/.cache/bazel/_bazel_root/24a36d3f089e715b642fd688d4461183/external/com_github_grpc_grpc/third_party/py/python_configure.bzl", line 236, column 22, in _get_python_include
result = _execute(
File "/root/.cache/bazel/_bazel_root/24a36d3f089e715b642fd688d4461183/external/com_github_grpc_grpc/third_party/py/python_configure.bzl", line 62, column 14, in _execute
_fail("\n".join([
File "/root/.cache/bazel/_bazel_root/24a36d3f089e715b642fd688d4461183/external/com_github_grpc_grpc/third_party/py/python_configure.bzl", line 35, column 9, in _fail
fail("%sPython Configuration Error:%s %s\n" % (red, no_color, msg))
Error in fail: Python Configuration Error: Problem getting python include path for /usr/bin/python3.
<string>:1: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
<string>:1: DeprecationWarning: The distutils.sysconfig module is deprecated, use sysconfig instead
Is the Python binary path set up right? (See ./configure or /usr/bin/python3.) Is distutils installed? Are Python headers installed? Try installing python-dev or python3-dev on Debian-based systems. Try python-devel or python3-devel on Redhat-based systems.
ERROR: /root/.cache/bazel/_bazel_root/24a36d3f089e715b642fd688d4461183/external/com_google_riegeli/python/riegeli/records/BUILD:8:13: @com_google_riegeli//python/riegeli/records:record_writer_cc depends on @local_config_python//:python_headers in repository @local_config_python which failed to fetch. no such package '@local_config_python//': Python Configuration Error: Problem getting python include path for /usr/bin/python3.
<string>:1: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
<string>:1: DeprecationWarning: The distutils.sysconfig module is deprecated, use sysconfig instead
Is the Python binary path set up right? (See ./configure or /usr/bin/python3.) Is distutils installed? Are Python headers installed? Try installing python-dev or python3-dev on Debian-based systems. Try python-devel or python3-devel on Redhat-based systems.
ERROR: Analysis of target '//:print_names_and_sources' failed; build aborted:
INFO: Elapsed time: 3.701s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (49 packages loaded, 348 targets configured)
FAILED: Build did NOT complete successfully (49 packages loaded, 348 targets configured)
Fetching #com_google_absl; Cloning tags/20211102.0 of https://github.com/abseil/abseil-cpp.git
root@c89a94de94ce://home/code_contests# bazel run -c opt \ :print_names_and_sources /tmp/dm-code_contests/code_contests_valid.riegeli
ERROR: Skipping ' :print_names_and_sources': no such package ' ': BUILD file not found in any of the following directories. Add a BUILD file to a directory to mark it as a package.
- /home/code_contests/
WARNING: Target pattern parsing failed.
ERROR: no such package ' ': BUILD file not found in any of the following directories. Add a BUILD file to a directory to mark it as a package.
- /home/code_contests/
INFO: Elapsed time: 0.174s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded)
FAILED: Build did NOT complete successfully (0 packages loaded)
I'm new to Ubuntu (as well as Bazel). How can I fix this error and run the project?
Link to the source code: https://github.com/deepmind/code_contests
You should make sure your gcc is the latest version.
For Python 2, you should install it like this:
#Remove bazel and reinstall
bazel clean --expunge
rm -rf ~/.cache/bazel
To re-install follow this instruction
#Install python2 dependency
sudo apt update && sudo apt install python-dev
For a detailed explanation kindly refer to this document. Thank you!
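Since the Bazel error explicitly suggests installing python-dev or python3-dev and the build is configured against /usr/bin/python3, installing the Python 3 headers inside the container is the more direct fix (a sketch, run as root inside the Docker Ubuntu container; the bazel command is the one from the question, written on a single line so the stray "\ " does not split off the target name):
apt update && apt install -y python3-dev
cd /home/code_contests
bazel run -c opt :print_names_and_sources /tmp/dm-code_contests/code_contests_valid.riegeli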

Docker failed with [Makefile:192: imagick_file.lo] Error 127

I am trying to install PHP in Docker for a working project, with this header in the Dockerfile:
FROM composer:1 AS composer
FROM php:7.4-fpm-alpine
COPY --from=composer /usr/bin/composer /usr/bin/composer
ENV PHPIZE_DEPS \
build-base \
...
I got errors:
checking if cc PIC flag -fPIC works... yes
checking if cc static flag -static works... yes
checking if cc supports -c -o file.o... yes
checking whether the cc linker (/usr/x86_64-alpine-linux-musl/bin/ld -m elf_x86_64) supports shared libraries... yes
checking whether -lc should be explicitly linked in... no
checking dynamic linker characteristics... GNU/Linux ld.so
checking how to hardcode library paths into programs... immediate
checking whether stripping libraries is possible... yes
checking if libtool supports shared libraries... yes
checking whether to build shared libraries... yes
checking whether to build static libraries... no
creating libtool
appending configuration tag "CXX" to libtool
configure: patching config.h.in
configure: creating ./config.status
config.status: creating config.h
running: make
/bin/sh /tmp/pear/temp/pear-build-defaultuserOFhDoe/imagick-3.4.3/libtool --mode=compile cc -I/usr/include/ImageMagick-7 -DMAGICKCORE_HDRI_ENABLE=1 -DMAGICKCORE_QUANTUM_DEPTH=16 -I. -I/tmp/pear/temp/imagick -DPHP_ATOM_INC -I/tmp/pear/temp/pear-build-defaultuserOFhDoe/imagick-3.4.3/include -I/tmp/pear/temp/pear-build-defaultuserOFhDoe/imagick-3.4.3/main -I/tmp/pear/temp/imagick -I/usr/local/include/php -I/usr/local/include/php/main -I/usr/local/include/php/TSRM -I/usr/local/include/php/Zend -I/usr/local/include/php/ext -I/usr/local/include/php/ext/date/lib -I/usr/include/ImageMagick-7 -DHAVE_CONFIG_H -g -O2 -c /tmp/pear/temp/imagick/imagick_file.c -o imagick_file.lo
make: /bin/sh: Operation not permitted
make: *** [Makefile:192: imagick_file.lo] Error 127
ERROR: `make' failed
ERROR: Service 'backapp' failed to build: The command '/bin/sh -c set -xe && apk add --no-cache ${PERMANENT_DEPS} && apk add --no-cache --virtual .build-deps ${PHPIZE_DEPS} && apk add --no-cache --repository http://dl-3.alpinelinux.org/alpine/edge/community gnu-libiconv && pecl install imagick-3.4.3 && docker-php-ext-enable imagick && docker-php-ext-configure pdo_mysql && docker-php-ext-configure bcmath --enable-bcmath && docker-php-ext-configure pcntl --enable-pcntl && docker-php-ext-configure intl --enable-intl && docker-php-ext-configure sysvmsg && docker-php-ext-configure sysvsem && docker-php-ext-configure sysvshm && docker-php-ext-install -j$(nproc) pdo_mysql sockets gettext bcmath pcntl intl sysvmsg sysvsem sysvshm && apk del .build-deps' returned a non-zero code: 1
Searching the net for a solution, I checked cc/gcc on my system:
ProjectName$ gcc -v
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/lib/gcc/x86_64-linux-gnu/9/lto-wrapper
OFFLOAD_TARGET_NAMES=nvptx-none:hsa
OFFLOAD_TARGET_DEFAULT=1
Target: x86_64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Ubuntu 9.3.0-17ubuntu1~20.04' --with-bugurl=file:///usr/share/doc/gcc-9/README.Bugs --enable-languages=c,ada,c++,go,brig,d,fortran,objc,obj-c++,gm2 --prefix=/usr --with-gcc-major-version-only --program-suffix=-9 --program-prefix=x86_64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-vtable-verify --enable-plugin --enable-default-pie --with-system-zlib --with-target-system-zlib=auto --enable-objc-gc=auto --enable-multiarch --disable-werror --with-arch-32=i686 --with-abi=m64 --with-multilib-list=m32,m64,mx32 --enable-multilib --with-tune=generic --enable-offload-targets=nvptx-none=/build/gcc-9-HskZEa/gcc-9-9.3.0/debian/tmp-nvptx/usr,hsa --without-cuda-driver --enable-checking=release --build=x86_64-linux-gnu --host=x86_64-linux-gnu --target=x86_64-linux-gnu
Thread model: posix
gcc version 9.3.0 (Ubuntu 9.3.0-17ubuntu1~20.04)
ProjectName$ cc -v
Using built-in specs.
COLLECT_GCC=cc
COLLECT_LTO_WRAPPER=/usr/lib/gcc/x86_64-linux-gnu/9/lto-wrapper
OFFLOAD_TARGET_NAMES=nvptx-none:hsa
OFFLOAD_TARGET_DEFAULT=1
Target: x86_64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Ubuntu 9.3.0-17ubuntu1~20.04' --with-bugurl=file:///usr/share/doc/gcc-9/README.Bugs --enable-languages=c,ada,c++,go,brig,d,fortran,objc,obj-c++,gm2 --prefix=/usr --with-gcc-major-version-only --program-suffix=-9 --program-prefix=x86_64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-vtable-verify --enable-plugin --enable-default-pie --with-system-zlib --with-target-system-zlib=auto --enable-objc-gc=auto --enable-multiarch --disable-werror --with-arch-32=i686 --with-abi=m64 --with-multilib-list=m32,m64,mx32 --enable-multilib --with-tune=generic --enable-offload-targets=nvptx-none=/build/gcc-9-HskZEa/gcc-9-9.3.0/debian/tmp-nvptx/usr,hsa --without-cuda-driver --enable-checking=release --build=x86_64-linux-gnu --host=x86_64-linux-gnu --target=x86_64-linux-gnu
Thread model: posix
ProjectName$ uname -a
Linux master-laptop 5.11.0-37-generic #41~20.04.2-Ubuntu SMP Fri Sep 24 09:06:38 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux
ProjectName$ composer -v
______
Composer version 2.1.8 2021-09-15 13:55:14
I also installed build-essential, but when checking its info I found an error in the output. Could this be the issue, and how can I fix it?
$ apt show build-essential — info -a
Package: build-essential
Version: 12.8ubuntu1.1
Priority: optional
Build-Essential: yes
Section: devel
Origin: Ubuntu
Maintainer: Ubuntu Developers <ubuntu-devel-discuss@lists.ubuntu.com>
Original-Maintainer: Matthias Klose <doko@debian.org>
Bugs: https://bugs.launchpad.net/ubuntu/+filebug
Installed-Size: 21,5 kB
Depends: libc6-dev | libc-dev, gcc (>= 4:9.2), g++ (>= 4:9.2), make, dpkg-dev (>= 1.17.11)
Task: ubuntu-mate-core, ubuntu-mate-desktop
Download-Size: 4 664 B
APT-Manual-Installed: yes
APT-Sources: http://ua.archive.ubuntu.com/ubuntu focal-updates/main amd64 Packages
Description: Informational list of build-essential packages
If you do not plan to build Debian packages, you don't need this
package. Starting with dpkg (>= 1.14.18) this package is required
for building Debian packages.
.
This package contains an informational list of packages which are
considered essential for building Debian packages. This package also
depends on the packages on that list, to make it easy to have the
build-essential packages installed.
.
If you have this package installed, you only need to install whatever
a package specifies as its build-time dependencies to build the
package. Conversely, if you are determining what your package needs
to build-depend on, you can always leave out the packages this
package depends on.
.
This package is NOT the definition of what packages are
build-essential; the real definition is in the Debian Policy Manual.
This package contains merely an informational list, which is all
most people need. However, if this package and the manual disagree,
the manual is correct.
Package: build-essential
Version: 12.8ubuntu1
Priority: optional
Build-Essential: yes
Section: devel
Origin: Ubuntu
Maintainer: Ubuntu Developers <ubuntu-devel-discuss@lists.ubuntu.com>
Original-Maintainer: Matthias Klose <doko@debian.org>
Bugs: https://bugs.launchpad.net/ubuntu/+filebug
Installed-Size: 20,5 kB
Depends: libc6-dev | libc-dev, gcc (>= 4:9.2), g++ (>= 4:9.2), make, dpkg-dev (>= 1.17.11)
Task: ubuntu-mate-core, ubuntu-mate-desktop
Download-Size: 4 624 B
APT-Sources: http://ua.archive.ubuntu.com/ubuntu focal/main amd64 Packages
Description: Informational list of build-essential packages
If you do not plan to build Debian packages, you don't need this
package. Starting with dpkg (>= 1.14.18) this package is required
for building Debian packages.
.
This package contains an informational list of packages which are
considered essential for building Debian packages. This package also
depends on the packages on that list, to make it easy to have the
build-essential packages installed.
.
If you have this package installed, you only need to install whatever
a package specifies as its build-time dependencies to build the
package. Conversely, if you are determining what your package needs
to build-depend on, you can always leave out the packages this
package depends on.
.
This package is NOT the definition of what packages are
build-essential; the real definition is in the Debian Policy Manual.
This package contains merely an informational list, which is all
most people need. However, if this package and the manual disagree,
the manual is correct.
Package: info
Version: 6.7.0.dfsg.2-5
Priority: standard
Section: doc
Source: texinfo
Origin: Ubuntu
Maintainer: Ubuntu Developers <ubuntu-devel-discuss@lists.ubuntu.com>
Original-Maintainer: Debian TeX maintainers <debian-tex-maint@lists.debian.org>
Bugs: https://bugs.launchpad.net/ubuntu/+filebug
Installed-Size: 831 kB
Provides: info-browser
Depends: libc6 (>= 2.15), libtinfo6 (>= 6), install-info
Breaks: texinfo-doc-nonfree
Replaces: texinfo (<< 4.7-2), texinfo-doc-nonfree
Homepage: https://www.gnu.org/software/texinfo/
Task: standard
Download-Size: 203 kB
APT-Manual-Installed: no
APT-Sources: http://ua.archive.ubuntu.com/ubuntu focal/main amd64 Packages
Description: Standalone GNU Info documentation browser
The Info file format is an easily-parsable representation for online
documents. This program allows you to view Info documents, like the
ones stored in /usr/share/info.
.
Much of the software in Debian comes with its online documentation in
the form of Info files, so it is most likely you will want to install it.
N: Unable to locate package —
How can I fix this error?
Modified BLOCK # 1:
Searching the net, I found hints that the reason for this error can be that some apps are not in $PATH.
So I added a line with the build-essential path:
export PATH="/usr/share/build-essential:$PATH"
in the file /home/master/.bashrc and ran the update command:
source ~/.bashrc
After that I checked that all related apps are in PATH:
master@master-laptop:ProjectName$ $PATH
bash: /usr/share/build-essential:/home/master/.composer/vendor/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin: No such file or directory
master@master-laptop:ProjectName$ whereis gcc
gcc: /usr/bin/gcc /usr/lib/gcc /usr/share/gcc /usr/share/man/man1/gcc.1.gz
master@master-laptop:ProjectName$ whereis cc
cc: /usr/bin/cc /usr/share/man/man1/cc.1.gz
master@master-laptop:ProjectName$ whereis build-essential
build-essential: /usr/share/build-essential
But anyway, I got the same error:
...
make: /bin/sh: Operation not permitted
make: *** [Makefile:192: imagick_file.lo] Error 127
Thanks!
It's difficult to provide an accurate answer without the whole Dockerfile, but I will try to point out your problem.
Well, first of all, you're using a multi-stage build, but you're not taking advantage of it. The whole point of the composer image isn't just the Composer PHAR: that image already contains all the dependencies required to build your PHP dependencies.
Meanwhile you have something like this:
FROM composer:1 as composer
FROM php:7.4-fpm-alpine
ENV ...
[BUILD_PROCESS]
[EXTRA_PROCESS]
You should have something like this:
FROM composer:1 as vendor
WORKDIR /app
COPY database/ database/
COPY composer.json composer.json
COPY composer.lock composer.lock
[BUILD_PROCESS]
FROM php:7.4-fpm-alpine
WORKDIR /app
COPY --from=vendor /app/vendor/ ./vendor/
COPY . .
[EXTRA_PROCESS]
Basically: delegate the build of the dependencies to the Composer image and use your lightweight PHP Alpine just for the application.
Using multiple FROM clauses creates multiple images
I agree with @Daniel Campos; however, I suggest looking at the resulting images. You should find multiple.
Have a look at this thread, which explains some more details:
Multiple FROMs - what it means
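A quick way to see those extra images after building (a sketch; the myapp tag is just an example):
docker build -t myapp .
docker images --filter "dangling=true"
With the classic builder, the earlier composer stage is left behind as an untagged <none> image, which is what the dangling filter lists.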

`pip install cryptography` not able to install on python3.7-alpine Docker

I'm trying to install cryptography using the Dockerfile below, on the python:3.7-alpine base.
However, I'm getting an error like this:
#7 17.27 Failed to build cryptography cffi
#7 17.37 Installing collected packages: pycparser, six, idna, cffi, asn1crypto, pymysql, cryptography
#7 17.68 Running setup.py install for cffi: started
#7 18.04 Running setup.py install for cffi: finished with status 'error'
#7 18.04 ERROR: Command errored out with exit status 1:
#7 18.04 command: /usr/local/bin/python -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-twxf8fj_/cffi_1f087e9b9e8c4af8b4e6bf3deb3a5bf9/setup.py'"'"'; __file__='"'"'/tmp/pip-install-twxf8fj_/cffi_1f087e9b9e8c4af8b4e6bf3deb3a5bf9/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-j8mkceyt/install-record.txt --single-version-externally-managed --compile --install-headers /usr/local/include/python3.7m/cffi
#7 18.04 cwd: /tmp/pip-install-twxf8fj_/cffi_1f087e9b9e8c4af8b4e6bf3deb3a5bf9/
#7 18.04 Complete output (48 lines):
#7 18.04 unable to execute 'gcc': No such file or directory
#7 18.04 unable to execute 'gcc': No such file or directory
As per the cryptography documentation I'm installing all the necessary packages, but it still throws the same error.
Dockerfile:
FROM python:3.7-alpine
WORKDIR /project
RUN apk add --no-cache --virtual gcc musl-dev python3-dev libffi-dev openssl-dev cargo mariadb-dev
RUN pip install pymysql cryptography
Can anyone point out what I'm doing wrong?
-t, --virtual NAME Instead of adding all the packages to 'world', create a new
virtual package with the listed dependencies and add that
to 'world'; the actions of the command are easily reverted
by deleting the virtual package
What that means is that when you install packages this way, they are not added to the global 'world', and the change can easily be reverted. So if I need gcc to compile a program, once the program is compiled I no longer need gcc.
I can install gcc and the other required packages under a virtual package, and later everything can be removed via that virtual package name. Below is an example usage:
apk add --virtual .dep gcc
apk del .dep
So, for you, in apk add --no-cache --virtual gcc, the gcc was wrongly treated as the virtual package name, which means you in fact did not install gcc.
To fix this, add a virtual package name before the list of packages, like this:
/ # apk add --no-cache --virtual .dep gcc
fetch https://mirrors.ustc.edu.cn/alpine/v3.14/main/x86_64/APKINDEX.tar.gz
fetch https://mirrors.ustc.edu.cn/alpine/v3.14/community/x86_64/APKINDEX.tar.gz
(1/12) Installing libgcc (10.3.1_git20210424-r2)
(2/12) Installing libstdc++ (10.3.1_git20210424-r2)
(3/12) Installing binutils (2.35.2-r2)
(4/12) Installing libgomp (10.3.1_git20210424-r2)
(5/12) Installing libatomic (10.3.1_git20210424-r2)
(6/12) Installing libgphobos (10.3.1_git20210424-r2)
(7/12) Installing gmp (6.2.1-r0)
(8/12) Installing isl22 (0.22-r0)
(9/12) Installing mpfr4 (4.1.0-r0)
(10/12) Installing mpc1 (1.2.1-r0)
(11/12) Installing gcc (10.3.1_git20210424-r2)
(12/12) Installing .dep (20210629.140945)
Executing busybox-1.33.1-r2.trigger
OK: 122 MiB in 47 packages
/ # gcc -v
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/libexec/gcc/x86_64-alpine-linux-musl/10.3.1/lto-wrapper
Target: x86_64-alpine-linux-musl
Configured with: /home/buildozer/aports/main/gcc/src/gcc-10.3.1_git20210424/configure --prefix=/usr --mandir=/usr/share/man --infodir=/usr/share/info --build=x86_64-alpine-linux-musl --host=x86_64-alpine-linux-musl --target=x86_64-alpine-linux-musl --with-pkgversion='Alpine 10.3.1_git20210424' --enable-checking=release --disable-fixed-point --disable-libstdcxx-pch --disable-multilib --disable-nls --disable-werror --disable-symvers --enable-__cxa_atexit --enable-default-pie --enable-default-ssp --enable-cloog-backend --enable-languages=c,c++,d,objc,go,fortran,ada --disable-libssp --disable-libmpx --disable-libmudflap --disable-libsanitizer --enable-shared --enable-threads --enable-tls --with-system-zlib --with-linker-hash-style=gnu
Thread model: posix
Supported LTO compression algorithms: zlib
gcc version 10.3.1 20210424 (Alpine 10.3.1_git20210424)
Or, as the documentation says, simply remove --virtual:
apk add gcc
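Applied to the Dockerfile from the question, the fix looks roughly like this (a sketch; .build-deps is an arbitrary virtual package name and the package list is the one already given in the question):
FROM python:3.7-alpine
WORKDIR /project
# install the build toolchain under a virtual name, build the wheels, then drop it again
RUN apk add --no-cache --virtual .build-deps gcc musl-dev python3-dev libffi-dev openssl-dev cargo mariadb-dev \
&& pip install pymysql cryptography \
&& apk del .build-deps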

Error during Preparing Wheel Metadata when installing Scipy on Alpine Docker

I'm new to Docker and have been trying to wrap a Python script in a container, but I'm getting an error during the installation of scipy 0.17.0 (as a dependency of scikit-learn) on my dockerized Alpine instance that I've not been able to find an answer to.
My Dockerfile:
FROM python:alpine
COPY . /app
WORKDIR /app
RUN apk add --no-cache python3-dev libstdc++ && \
apk add --no-cache g++ && \
ln -s /usr/include/locale.h /usr/include/xlocale.h
RUN pip3 install -r requirements.txt
CMD python ./python_script.py
My requirements.txt file:
numpy==1.16.5
pandas==0.25.2
scikit-learn==0.21.3
Output:
Sending build context to Docker daemon 1.097GB
Step 1/7 : FROM python:alpine
---> 204216b3821e
Step 2/7 : COPY . /app
---> 1e06520a2b68
Step 3/7 : WORKDIR /app
---> Running in 3ca38d1b57ad
Removing intermediate container 3ca38d1b57ad
---> 7594bc07da39
Step 4/7 : RUN apk add --no-cache python3-dev libstdc++ && apk add --no-cache g++ && ln -s /usr/include/locale.h /usr/include/xlocale.h
---> Running in 58ec019d5e3a
fetch http://dl-cdn.alpinelinux.org/alpine/v3.10/main/x86_64/APKINDEX.tar.gz
fetch http://dl-cdn.alpinelinux.org/alpine/v3.10/community/x86_64/APKINDEX.tar.gz
(1/5) Installing libgcc (8.3.0-r0)
(2/5) Installing libstdc++ (8.3.0-r0)
(3/5) Installing pkgconf (1.6.1-r1)
(4/5) Installing python3 (3.7.5-r1)
(5/5) Installing python3-dev (3.7.5-r1)
Executing busybox-1.30.1-r2.trigger
OK: 119 MiB in 40 packages
fetch http://dl-cdn.alpinelinux.org/alpine/v3.10/main/x86_64/APKINDEX.tar.gz
fetch http://dl-cdn.alpinelinux.org/alpine/v3.10/community/x86_64/APKINDEX.tar.gz
(1/11) Installing binutils (2.32-r0)
(2/11) Installing gmp (6.1.2-r1)
(3/11) Installing isl (0.18-r0)
(4/11) Installing libgomp (8.3.0-r0)
(5/11) Installing libatomic (8.3.0-r0)
(6/11) Installing mpfr3 (3.1.5-r1)
(7/11) Installing mpc1 (1.1.0-r0)
(8/11) Installing gcc (8.3.0-r0)
(9/11) Installing musl-dev (1.1.22-r3)
(10/11) Installing libc-dev (0.7.1-r0)
(11/11) Installing g++ (8.3.0-r0)
Executing busybox-1.30.1-r2.trigger
OK: 270 MiB in 51 packages
Removing intermediate container 58ec019d5e3a
---> 9b5f6907e101
Step 5/7 : RUN pip3 install -r requirements.txt
---> Running in 44672f5d6b33
Collecting numpy==1.16.5
Downloading https://files.pythonhosted.org/packages/db/ec/93ddd4696e9cce0ffb8429516a8ba0d0ee95911cbbadde2d23665b62ad39/numpy-1.16.5.zip (5.1MB)
Collecting pandas==0.25.2
Downloading https://files.pythonhosted.org/packages/42/cb/e3b69df7d3e6095a5e86fbe930e57f3f0a440fb73f350ab253efe2c7b924/pandas-0.25.2.tar.gz (12.6MB)
Collecting scikit-learn==0.21.3
Downloading https://files.pythonhosted.org/packages/1e/ce/9d8c88e68af0a5b5c5d78d8d2b7bcadfd45e1d6afc863ccb9aee30765b06/scikit-learn-0.21.3.tar.gz (12.2MB)
Collecting python-dateutil>=2.6.1
Downloading https://files.pythonhosted.org/packages/d4/70/d60450c3dd48ef87586924207ae8907090de0b306af2bce5d134d78615cb/python_dateutil-2.8.1-py2.py3-none-any.whl (227kB)
Collecting pytz>=2017.2
Downloading https://files.pythonhosted.org/packages/e7/f9/f0b53f88060247251bf481fa6ea62cd0d25bf1b11a87888e53ce5b7c8ad2/pytz-2019.3-py2.py3-none-any.whl (509kB)
Collecting scipy>=0.17.0
Downloading https://files.pythonhosted.org/packages/df/20/2f147945396fa74b467d696e42aa55eb603d166490a8ebe4d79e54c793df/scipy-1.3.2.tar.gz (23.6MB)
Installing build dependencies: started
Installing build dependencies: still running...
Installing build dependencies: still running...
Installing build dependencies: still running...
Installing build dependencies: still running...
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status 'done'
Preparing wheel metadata: started
Preparing wheel metadata: finished with status 'error'
ERROR: Command errored out with exit status 1:
command: /usr/local/bin/python /usr/local/lib/python3.8/site-packages/pip/_vendor/pep517/_in_process.py prepare_metadata_for_build_wheel /tmp/tmp6r_mhwge
cwd: /tmp/pip-install-xqdy89zg/scipy
Complete output (139 lines):
lapack_opt_info:
lapack_mkl_info:
customize UnixCCompiler
libraries mkl_rt not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
openblas_lapack_info:
customize UnixCCompiler
customize UnixCCompiler
libraries openblas not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
openblas_clapack_info:
customize UnixCCompiler
customize UnixCCompiler
libraries openblas,lapack not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
flame_info:
customize UnixCCompiler
libraries flame not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
atlas_3_10_threads_info:
Setting PTATLAS=ATLAS
customize UnixCCompiler
libraries lapack_atlas not found in /usr/local/lib
customize UnixCCompiler
libraries tatlas,tatlas not found in /usr/local/lib
customize UnixCCompiler
libraries lapack_atlas not found in /usr/lib
customize UnixCCompiler
libraries tatlas,tatlas not found in /usr/lib
customize UnixCCompiler
libraries lapack_atlas not found in /usr/lib/
customize UnixCCompiler
libraries tatlas,tatlas not found in /usr/lib/
<class 'numpy.distutils.system_info.atlas_3_10_threads_info'>
NOT AVAILABLE
atlas_3_10_info:
customize UnixCCompiler
libraries lapack_atlas not found in /usr/local/lib
customize UnixCCompiler
libraries satlas,satlas not found in /usr/local/lib
customize UnixCCompiler
libraries lapack_atlas not found in /usr/lib
customize UnixCCompiler
libraries satlas,satlas not found in /usr/lib
customize UnixCCompiler
libraries lapack_atlas not found in /usr/lib/
customize UnixCCompiler
libraries satlas,satlas not found in /usr/lib/
<class 'numpy.distutils.system_info.atlas_3_10_info'>
NOT AVAILABLE
atlas_threads_info:
Setting PTATLAS=ATLAS
customize UnixCCompiler
libraries lapack_atlas not found in /usr/local/lib
customize UnixCCompiler
libraries ptf77blas,ptcblas,atlas not found in /usr/local/lib
customize UnixCCompiler
libraries lapack_atlas not found in /usr/lib
customize UnixCCompiler
libraries ptf77blas,ptcblas,atlas not found in /usr/lib
customize UnixCCompiler
libraries lapack_atlas not found in /usr/lib/
customize UnixCCompiler
libraries ptf77blas,ptcblas,atlas not found in /usr/lib/
<class 'numpy.distutils.system_info.atlas_threads_info'>
NOT AVAILABLE
atlas_info:
customize UnixCCompiler
libraries lapack_atlas not found in /usr/local/lib
customize UnixCCompiler
libraries f77blas,cblas,atlas not found in /usr/local/lib
customize UnixCCompiler
libraries lapack_atlas not found in /usr/lib
customize UnixCCompiler
libraries f77blas,cblas,atlas not found in /usr/lib
customize UnixCCompiler
libraries lapack_atlas not found in /usr/lib/
customize UnixCCompiler
libraries f77blas,cblas,atlas not found in /usr/lib/
<class 'numpy.distutils.system_info.atlas_info'>
NOT AVAILABLE
accelerate_info:
NOT AVAILABLE
lapack_info:
customize UnixCCompiler
libraries lapack not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
lapack_src_info:
NOT AVAILABLE
NOT AVAILABLE
setup.py:386: UserWarning: Unrecognized setuptools command ('dist_info --egg-base /tmp/pip-modern-metadata-cr892mzk'), proceeding with generating Cython sources and expanding templates
warnings.warn("Unrecognized setuptools command ('{}'), proceeding with "
Running from scipy source directory.
/tmp/pip-build-env-8rlzds3x/overlay/lib/python3.8/site-packages/numpy/distutils/system_info.py:1712: UserWarning:
Lapack (http://www.netlib.org/lapack/) libraries not found.
Directories to search for the libraries can be specified in the
numpy/distutils/site.cfg file (section [lapack]) or by setting
the LAPACK environment variable.
if getattr(self, '_calc_info_{}'.format(lapack))():
/tmp/pip-build-env-8rlzds3x/overlay/lib/python3.8/site-packages/numpy/distutils/system_info.py:1712: UserWarning:
Lapack (http://www.netlib.org/lapack/) sources not found.
Directories to search for the sources can be specified in the
numpy/distutils/site.cfg file (section [lapack_src]) or by setting
the LAPACK_SRC environment variable.
if getattr(self, '_calc_info_{}'.format(lapack))():
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/pip/_vendor/pep517/_in_process.py", line 257, in <module>
main()
File "/usr/local/lib/python3.8/site-packages/pip/_vendor/pep517/_in_process.py", line 240, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "/usr/local/lib/python3.8/site-packages/pip/_vendor/pep517/_in_process.py", line 110, in prepare_metadata_for_build_wheel
return hook(metadata_directory, config_settings)
File "/tmp/pip-build-env-8rlzds3x/overlay/lib/python3.8/site-packages/setuptools/build_meta.py", line 156, in prepare_metadata_for_build_wheel
self.run_setup()
File "/tmp/pip-build-env-8rlzds3x/overlay/lib/python3.8/site-packages/setuptools/build_meta.py", line 236, in run_setup
super(_BuildMetaLegacyBackend,
File "/tmp/pip-build-env-8rlzds3x/overlay/lib/python3.8/site-packages/setuptools/build_meta.py", line 142, in run_setup
exec(compile(code, __file__, 'exec'), locals())
File "setup.py", line 505, in <module>
setup_package()
File "setup.py", line 501, in setup_package
setup(**metadata)
File "/tmp/pip-build-env-8rlzds3x/overlay/lib/python3.8/site-packages/numpy/distutils/core.py", line 137, in setup
config = configuration()
File "setup.py", line 403, in configuration
raise NotFoundError(msg)
numpy.distutils.system_info.NotFoundError: No lapack/blas resources found.
----------------------------------------
ERROR: Command errored out with exit status 1: /usr/local/bin/python /usr/local/lib/python3.8/site-packages/pip/_vendor/pep517/_in_process.py prepare_metadata_for_build_wheel /tmp/tmp6r_mhwge Check the logs for full command output.
The command '/bin/sh -c pip3 install -r requirements.txt' returned a non-zero code: 1
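No fix is shown above, but the traceback itself names the missing piece: numpy.distutils aborts with "No lapack/blas resources found", and the python:alpine image ships neither BLAS/LAPACK nor a Fortran compiler. One possible direction is to install them before pip runs (a sketch only, assuming the Alpine gfortran, openblas-dev and lapack-dev packages provide what the scipy build is probing for):
FROM python:alpine
COPY . /app
WORKDIR /app
RUN apk add --no-cache python3-dev libstdc++ g++ gfortran openblas-dev lapack-dev && \
ln -s /usr/include/locale.h /usr/include/xlocale.h
RUN pip3 install -r requirements.txt
CMD python ./python_script.py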
