Problems extending Airflow image for Helm - docker

I'm using Helm to install Airflow in a Kubernetes cluster. I wanted the pods to have an additional apt dependency (sshpass), so I tried to extend the Airflow image the Helm chart uses and add that apt dependency via Docker. My Dockerfile looks like this:
FROM apache/airflow:1.10.12-python3.6
USER root
RUN apt-get update \
    && apt-get install -y --no-install-recommends \
        sshpass \
    && apt-get autoremove -yqq --purge \
    && apt-get clean \
    && rm -rf /var/lib/apt/lists/*
USER airflow
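For concreteness, the build-and-push step looked roughly like this (the account, repo, and tag are placeholders for my real ones):

```shell
# Placeholder registry coordinates -- substitute your own account/repo/tag.
IMAGE=myaccount/myrepo
TAG=mytag

# Build from the Dockerfile above and publish the result:
docker build -t "$IMAGE:$TAG" .
docker push "$IMAGE:$TAG"
```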
I followed the example given in the Airflow docs here: https://airflow.apache.org/docs/apache-airflow/stable/production-deployment.html#extending-the-image, but replaced the base image with the one the Helm chart uses.
I then built and published the new image, and pointed Helm at it by editing the values.yaml file like so:
airflow:
  ## configs for the docker image of the web/scheduler/worker
  ##
  image:
    repository: myaccount/myrepo
    tag: mytag
    ## values: Always or IfNotPresent
    pullPolicy: IfNotPresent
    pullSecret: ""
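The installation command itself was along these lines (release name, namespace, and chart reference below are placeholders for my actual ones):

```shell
RELEASE=airflow      # placeholder release name
NAMESPACE=airflow    # placeholder namespace

# Re-deploy the chart with the edited values.yaml:
helm upgrade --install "$RELEASE" stable/airflow \
  --namespace "$NAMESPACE" \
  --values values.yaml
```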
I then ran the installation command, but the launched pods go into CrashLoopBackOff and Error states. I checked the logs of one of the pods:
*** installing requirements...
Collecting airflow-multi-dagrun==1.2 (from -r requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/af/41/e60dff951d002dbf14daf601b1323dfc48c0d24d2bc4e7d19ac72b19c3f6/airflow_multi_dagrun-1.2-py3-none-any.whl
Collecting azure-storage-blob==12.4.0 (from -r requirements.txt (line 2))
Downloading https://files.pythonhosted.org/packages/60/dc/3c25aeab827266c019e13abc07f31b5f47d93f1b8548a417c81c89c9d021/azure_storage_blob-12.4.0-py2.py3-none-any.whl (326kB)
Collecting azure-cosmosdb-table==1.0.6 (from -r requirements.txt (line 3))
Downloading https://files.pythonhosted.org/packages/f0/e4/15a59108883cc47460b1475aeac935e2d975b5def42f2c0a8b8fd48b3304/azure_cosmosdb_table-1.0.6-py2.py3-none-any.whl (125kB)
Collecting pandas==1.1.2 (from -r requirements.txt (line 4))
Downloading https://files.pythonhosted.org/packages/1c/11/e1f53db0614f2721027aab297c8afd2eaf58d33d566441a97ea454541c5e/pandas-1.1.2-cp36-cp36m-manylinux1_x86_64.whl (10.5MB)
Collecting pyarrow==1.0.1 (from -r requirements.txt (line 5))
Downloading https://files.pythonhosted.org/packages/1f/32/34b44246e671a2ba672fc2551dd3a85472ba07e58a75da1dbc655810fbfa/pyarrow-1.0.1-cp36-cp36m-manylinux2010_x86_64.whl (17.5MB)
Collecting azure-core<2.0.0,>=1.6.0 (from azure-storage-blob==12.4.0->-r requirements.txt (line 2))
Downloading https://files.pythonhosted.org/packages/12/9e/6bb67fe85f6a89d71f50c86a0da778a5064f749a485ed9ba498067034227/azure_core-1.10.0-py2.py3-none-any.whl (125kB)
Collecting msrest>=0.6.10 (from azure-storage-blob==12.4.0->-r requirements.txt (line 2))
Downloading https://files.pythonhosted.org/packages/fa/f5/9e315fe8cb985b0ce052b34bcb767883dc739f46fadb62f05a7e6d6eedbe/msrest-0.6.19-py2.py3-none-any.whl (84kB)
Collecting cryptography>=2.1.4 (from azure-storage-blob==12.4.0->-r requirements.txt (line 2))
Downloading https://files.pythonhosted.org/packages/c9/de/7054df0620b5411ba45480f0261e1fb66a53f3db31b28e3aa52c026e72d9/cryptography-3.3.1-cp36-abi3-manylinux2010_x86_64.whl (2.6MB)
Collecting azure-common>=1.1.5 (from azure-cosmosdb-table==1.0.6->-r requirements.txt (line 3))
Downloading https://files.pythonhosted.org/packages/19/2b/46ada1753c4a640bc3ad04a1e20b1a5ea52a8f18079e1b8238e536aa0c98/azure_common-1.1.26-py2.py3-none-any.whl
Collecting requests (from azure-cosmosdb-table==1.0.6->-r requirements.txt (line 3))
Downloading https://files.pythonhosted.org/packages/29/c1/24814557f1d22c56d50280771a17307e6bf87b70727d975fd6b2ce6b014a/requests-2.25.1-py2.py3-none-any.whl (61kB)
Collecting azure-cosmosdb-nspkg>=2.0.0 (from azure-cosmosdb-table==1.0.6->-r requirements.txt (line 3))
Downloading https://files.pythonhosted.org/packages/58/0d/1329b47e5386b0acf4e42ada2284851eff60ef3337a87e5b2dfedabbfcb1/azure_cosmosdb_nspkg-2.0.2-py2.py3-none-any.whl
Collecting python-dateutil (from azure-cosmosdb-table==1.0.6->-r requirements.txt (line 3))
Downloading https://files.pythonhosted.org/packages/d4/70/d60450c3dd48ef87586924207ae8907090de0b306af2bce5d134d78615cb/python_dateutil-2.8.1-py2.py3-none-any.whl (227kB)
Collecting pytz>=2017.2 (from pandas==1.1.2->-r requirements.txt (line 4))
Downloading https://files.pythonhosted.org/packages/89/06/2c2d3034b4d6bf22f2a4ae546d16925898658a33b4400cfb7e2c1e2871a3/pytz-2020.5-py2.py3-none-any.whl (510kB)
Collecting numpy>=1.15.4 (from pandas==1.1.2->-r requirements.txt (line 4))
Downloading https://files.pythonhosted.org/packages/14/32/d3fa649ad7ec0b82737b92fefd3c4dd376b0bb23730715124569f38f3a08/numpy-1.19.5-cp36-cp36m-manylinux2010_x86_64.whl (14.8MB)
Collecting six>=1.6 (from azure-core<2.0.0,>=1.6.0->azure-storage-blob==12.4.0->-r requirements.txt (line 2))
Downloading https://files.pythonhosted.org/packages/ee/ff/48bde5c0f013094d729fe4b0316ba2a24774b3ff1c52d924a8a4cb04078a/six-1.15.0-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from msrest>=0.6.10->azure-storage-blob==12.4.0->-r requirements.txt (line 2))
Downloading https://files.pythonhosted.org/packages/5e/a0/5f06e1e1d463903cf0c0eebeb751791119ed7a4b3737fdc9a77f1cdfb51f/certifi-2020.12.5-py2.py3-none-any.whl (147kB)
Collecting requests-oauthlib>=0.5.0 (from msrest>=0.6.10->azure-storage-blob==12.4.0->-r requirements.txt (line 2))
Downloading https://files.pythonhosted.org/packages/a3/12/b92740d845ab62ea4edf04d2f4164d82532b5a0b03836d4d4e71c6f3d379/requests_oauthlib-1.3.0-py2.py3-none-any.whl
Collecting isodate>=0.6.0 (from msrest>=0.6.10->azure-storage-blob==12.4.0->-r requirements.txt (line 2))
Downloading https://files.pythonhosted.org/packages/9b/9f/b36f7774ff5ea8e428fdcfc4bb332c39ee5b9362ddd3d40d9516a55221b2/isodate-0.6.0-py2.py3-none-any.whl (45kB)
Collecting cffi>=1.12 (from cryptography>=2.1.4->azure-storage-blob==12.4.0->-r requirements.txt (line 2))
Downloading https://files.pythonhosted.org/packages/1c/1a/90fa7e7ee05d91d0339ef264bd8c008f57292aba4a91ec512a0bbb379d1d/cffi-1.14.4-cp36-cp36m-manylinux1_x86_64.whl (401kB)
Collecting chardet<5,>=3.0.2 (from requests->azure-cosmosdb-table==1.0.6->-r requirements.txt (line 3))
Downloading https://files.pythonhosted.org/packages/19/c7/fa589626997dd07bd87d9269342ccb74b1720384a4d739a1872bd84fbe68/chardet-4.0.0-py2.py3-none-any.whl (178kB)
Collecting urllib3<1.27,>=1.21.1 (from requests->azure-cosmosdb-table==1.0.6->-r requirements.txt (line 3))
Downloading https://files.pythonhosted.org/packages/f5/71/45d36a8df68f3ebb098d6861b2c017f3d094538c0fb98fa61d4dc43e69b9/urllib3-1.26.2-py2.py3-none-any.whl (136kB)
Collecting idna<3,>=2.5 (from requests->azure-cosmosdb-table==1.0.6->-r requirements.txt (line 3))
Downloading https://files.pythonhosted.org/packages/a2/38/928ddce2273eaa564f6f50de919327bf3a00f091b5baba8dfa9460f3a8a8/idna-2.10-py2.py3-none-any.whl (58kB)
Collecting azure-nspkg>=2.0.0 (from azure-cosmosdb-nspkg>=2.0.0->azure-cosmosdb-table==1.0.6->-r requirements.txt (line 3))
Downloading https://files.pythonhosted.org/packages/c4/0c/c562be95a9a2ed52454f598571cf300b1114d0db2aa27f5b8ed3bb9cd0c0/azure_nspkg-3.0.2-py3-none-any.whl
Collecting oauthlib>=3.0.0 (from requests-oauthlib>=0.5.0->msrest>=0.6.10->azure-storage-blob==12.4.0->-r requirements.txt (line 2))
Downloading https://files.pythonhosted.org/packages/05/57/ce2e7a8fa7c0afb54a0581b14a65b56e62b5759dbc98e80627142b8a3704/oauthlib-3.1.0-py2.py3-none-any.whl (147kB)
Collecting pycparser (from cffi>=1.12->cryptography>=2.1.4->azure-storage-blob==12.4.0->-r requirements.txt (line 2))
Downloading https://files.pythonhosted.org/packages/ae/e7/d9c3a176ca4b02024debf82342dab36efadfc5776f9c8db077e8f6e71821/pycparser-2.20-py2.py3-none-any.whl (112kB)
Installing collected packages: airflow-multi-dagrun, six, chardet, urllib3, idna, certifi, requests, azure-core, oauthlib, requests-oauthlib, isodate, msrest, pycparser, cffi, cryptography, azure-storage-blob, azure-common, azure-nspkg,azure-cosmosdb-nspkg, python-dateutil, azure-cosmosdb-table, pytz, numpy, pandas, pyarrow
The script chardetect is installed in '/root/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
The scripts f2py, f2py3 and f2py3.6 are installed in '/root/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
The script plasma_store is installed in '/root/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
Successfully installed airflow-multi-dagrun-1.2 azure-common-1.1.26 azure-core-1.10.0 azure-cosmosdb-nspkg-2.0.2 azure-cosmosdb-table-1.0.6 azure-nspkg-3.0.2 azure-storage-blob-12.4.0 certifi-2020.12.5 cffi-1.14.4 chardet-4.0.0 cryptography-3.3.1 idna-2.10 isodate-0.6.0 msrest-0.6.19 numpy-1.19.5 oauthlib-3.1.0 pandas-1.1.2 pyarrow-1.0.1 pycparser-2.20 python-dateutil-2.8.1 pytz-2020.5 requests-2.25.1 requests-oauthlib-1.3.0 six-1.15.0 urllib3-1.26.2
You are using pip version 19.0.2, however version 20.3.3 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
*** installing extra pip packages...
Requirement already satisfied: azure-storage-blob==12.4.0 in /root/.local/lib/python3.6/site-packages (12.4.0)
Requirement already satisfied: azure-core<2.0.0,>=1.6.0 in /root/.local/lib/python3.6/site-packages (from azure-storage-blob==12.4.0) (1.10.0)
Requirement already satisfied: msrest>=0.6.10 in /root/.local/lib/python3.6/site-packages (from azure-storage-blob==12.4.0) (0.6.19)
Requirement already satisfied: cryptography>=2.1.4 in /root/.local/lib/python3.6/site-packages (from azure-storage-blob==12.4.0) (3.3.1)
Requirement already satisfied: six>=1.6 in /root/.local/lib/python3.6/site-packages (from azure-core<2.0.0,>=1.6.0->azure-storage-blob==12.4.0) (1.15.0)
Requirement already satisfied: requests>=2.18.4 in /root/.local/lib/python3.6/site-packages (from azure-core<2.0.0,>=1.6.0->azure-storage-blob==12.4.0) (2.25.1)
Requirement already satisfied: requests-oauthlib>=0.5.0 in /root/.local/lib/python3.6/site-packages (from msrest>=0.6.10->azure-storage-blob==12.4.0) (1.3.0)
Requirement already satisfied: certifi>=2017.4.17 in /root/.local/lib/python3.6/site-packages (from msrest>=0.6.10->azure-storage-blob==12.4.0) (2020.12.5)
Requirement already satisfied: isodate>=0.6.0 in /root/.local/lib/python3.6/site-packages (from msrest>=0.6.10->azure-storage-blob==12.4.0) (0.6.0)
Requirement already satisfied: cffi>=1.12 in /root/.local/lib/python3.6/site-packages (from cryptography>=2.1.4->azure-storage-blob==12.4.0) (1.14.4)
Requirement already satisfied: idna<3,>=2.5 in /root/.local/lib/python3.6/site-packages (from requests>=2.18.4->azure-core<2.0.0,>=1.6.0->azure-storage-blob==12.4.0) (2.10)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /root/.local/lib/python3.6/site-packages (from requests>=2.18.4->azure-core<2.0.0,>=1.6.0->azure-storage-blob==12.4.0) (1.26.2)
Requirement already satisfied: chardet<5,>=3.0.2 in /root/.local/lib/python3.6/site-packages (from requests>=2.18.4->azure-core<2.0.0,>=1.6.0->azure-storage-blob==12.4.0) (4.0.0)
Requirement already satisfied: oauthlib>=3.0.0 in /root/.local/lib/python3.6/site-packages (from requests-oauthlib>=0.5.0->msrest>=0.6.10->azure-storage-blob==12.4.0) (3.1.0)
Requirement already satisfied: pycparser in /root/.local/lib/python3.6/site-packages (from cffi>=1.12->cryptography>=2.1.4->azure-storage-blob==12.4.0) (2.20)
You are using pip version 19.0.2, however version 20.3.3 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
*** running webserver...
Traceback (most recent call last):
File "/home/airflow/.local/bin/airflow", line 23, in <module>
import argcomplete
ModuleNotFoundError: No module named 'argcomplete'
From what I can gather, the Python dependencies Airflow needs are being installed for the root user (/root/.local/lib/python3.6/site-packages) instead of for the airflow user that actually runs airflow, which therefore can't find them. When I don't extend the image and just use the default one in the Helm chart, the installation path is /home/airflow/.local/lib/python3.6/site-packages and everything works. Why does my custom image behave like this?
Do I need to add anything else to the Dockerfile? Am I doing something wrong? How can I fix this?
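One workaround worth considering (a sketch only, not verified against this chart -- the requirements.txt name and the idea of baking the chart's runtime requirements into the image are my assumptions): install the Python packages at build time as the airflow user, so they land under /home/airflow/.local rather than being installed by the entrypoint as root:

```dockerfile
FROM apache/airflow:1.10.12-python3.6

USER root
RUN apt-get update \
    && apt-get install -y --no-install-recommends sshpass \
    && apt-get autoremove -yqq --purge \
    && apt-get clean \
    && rm -rf /var/lib/apt/lists/*

# Switch back to the unprivileged user BEFORE any pip installs, so that
# `pip install --user` resolves to /home/airflow/.local -- the path the
# airflow binary actually searches.
USER airflow
COPY requirements.txt /requirements.txt
RUN pip install --user -r /requirements.txt
```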
Thank you in advance

Related

Can you help me install Kivy from VS Code?

error: subprocess-exited-with-error
× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> [11 lines of output]
DEPRECATION: --no-binary currently disables reading from the cache of locally built wheels. In the future --no-binary
will not influence the wheel cache. pip 23.1 will enforce this behaviour change. A possible replacement is to use the --no-cache-dir option. You can use the flag --use-feature=no-binary-enable-wheel-cache to test the upcoming behaviour. Discussion
can be found at https://github.com/pypa/pip/issues/11453
Collecting setuptools
Using cached setuptools-65.6.3-py3-none-any.whl (1.2 MB)
Collecting wheel
Using cached wheel-0.38.4-py3-none-any.whl (36 kB)
Collecting cython!=0.27,!=0.27.2,<=0.29.28,>=0.24
Using cached Cython-0.29.28-py2.py3-none-any.whl (983 kB)
Collecting kivy_deps.gstreamer_dev~=0.3.3
Using cached kivy_deps.gstreamer_dev-0.3.3-cp311-cp311-win_amd64.whl (3.9 MB)
ERROR: Could not find a version that satisfies the requirement kivy_deps.sdl2_dev~=0.4.5 (from versions: 0.5.1)
ERROR: No matching distribution found for kivy_deps.sdl2_dev~=0.4.5
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error
× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip
Install Kivy in VS Code on Windows

I can't install the Python shapely package on an Alpine-based Docker image

I'm trying to build a custom Docker image from the nodered/node-red image (Alpine-based). I've tried many approaches and looked at many solutions on Google, but I still get an error when installing the shapely package. My Dockerfile is:
FROM nodered/node-red
USER root
RUN apk update
RUN apk add py3-pip
RUN pip install pymap3d
RUN apk --update add build-base libxslt-dev
RUN apk add --virtual .build-deps \
        --repository http://dl-cdn.alpinelinux.org/alpine/edge/testing \
        --repository http://dl-cdn.alpinelinux.org/alpine/edge/main \
        gcc libc-dev geos-dev geos && \
    runDeps="$(scanelf --needed --nobanner --recursive /usr/local \
        | awk '{ gsub(/,/, "\nso:", $2); print "so:" $2 }' \
        | xargs -r apk info --installed \
        | sort -u)" && \
    apk add --virtual .rundeps $runDeps
RUN geos-config --cflags
#RUN pip install --disable-pip-version-check shapely
RUN apk del build-base python3-dev && \
    rm -rf /var/cache/apk/*
RUN pip install shapely
I've tried every solution I found on the internet but couldn't fix the error. I've seen here and here and here and here and some other pages, and my problem is still not solved.
Update:
The docker build output is:
Step 10/10 : RUN pip install shapely
---> Running in a5f6d97702e7
Collecting shapely
Downloading shapely-2.0.0.tar.gz (274 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 274.5/274.5 kB 956.0 kB/s eta 0:00:00
Installing build dependencies: started
Installing build dependencies: finished with status 'error'
error: subprocess-exited-with-error
× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> [301 lines of output]
Collecting Cython~=0.29
Downloading Cython-0.29.32-cp310-cp310-musllinux_1_1_x86_64.whl (2.0 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.0/2.0 MB 1.5 MB/s eta 0:00:00
Collecting oldest-supported-numpy
Downloading oldest_supported_numpy-2022.11.19-py3-none-any.whl (4.9 kB)
Collecting setuptools>=61.0.0
Downloading setuptools-65.6.3-py3-none-any.whl (1.2 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.2/1.2 MB 1.6 MB/s eta 0:00:00
Collecting numpy==1.21.6
Downloading numpy-1.21.6.zip (10.3 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 10.3/10.3 MB 1.6 MB/s eta 0:00:00
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status 'done'
Preparing metadata (pyproject.toml): started
Preparing metadata (pyproject.toml): finished with status 'done'
Building wheels for collected packages: numpy
Building wheel for numpy (pyproject.toml): started
Building wheel for numpy (pyproject.toml): finished with status 'error'
error: subprocess-exited-with-error
× Building wheel for numpy (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [270 lines of output]
setup.py:63: RuntimeWarning: NumPy 1.21.6 may not yet support Python 3.10.
warnings.warn(
Running from numpy source directory.
/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/tools/cythonize.py:69: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
from distutils.version import LooseVersion
Processing numpy/random/_bounded_integers.pxd.in
Processing numpy/random/mtrand.pyx
Processing numpy/random/_sfc64.pyx
Processing numpy/random/bit_generator.pyx
Processing numpy/random/_philox.pyx
Processing numpy/random/_mt19937.pyx
Processing numpy/random/_common.pyx
Processing numpy/random/_generator.pyx
Processing numpy/random/_pcg64.pyx
Processing numpy/random/_bounded_integers.pyx.in
Cythonizing sources
blas_opt_info:
blas_mkl_info:
customize UnixCCompiler
libraries mkl_rt not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
blis_info:
libraries blis not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
openblas_info:
libraries openblas not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
accelerate_info:
NOT AVAILABLE
atlas_3_10_blas_threads_info:
Setting PTATLAS=ATLAS
libraries tatlas not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
atlas_3_10_blas_info:
libraries satlas not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
atlas_blas_threads_info:
Setting PTATLAS=ATLAS
libraries ptf77blas,ptcblas,atlas not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
atlas_blas_info:
libraries f77blas,cblas,atlas not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/system_info.py:2026: UserWarning:
Optimized (vendor) Blas libraries are not found.
Falls back to netlib Blas library which has worse performance.
A better performance should be easily gained by switching
Blas library.
if self._calc_info(blas):
blas_info:
libraries blas not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/system_info.py:2026: UserWarning:
Blas (http://www.netlib.org/blas/) libraries not found.
Directories to search for the libraries can be specified in the
numpy/distutils/site.cfg file (section [blas]) or by setting
the BLAS environment variable.
if self._calc_info(blas):
blas_src_info:
NOT AVAILABLE
/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/system_info.py:2026: UserWarning:
Blas (http://www.netlib.org/blas/) sources not found.
Directories to search for the sources can be specified in the
numpy/distutils/site.cfg file (section [blas_src]) or by setting
the BLAS_SRC environment variable.
if self._calc_info(blas):
NOT AVAILABLE
non-existing path in 'numpy/distutils': 'site.cfg'
lapack_opt_info:
lapack_mkl_info:
libraries mkl_rt not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
openblas_lapack_info:
libraries openblas not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
openblas_clapack_info:
libraries openblas,lapack not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
flame_info:
libraries flame not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
atlas_3_10_threads_info:
Setting PTATLAS=ATLAS
libraries lapack_atlas not found in /usr/local/lib
libraries tatlas,tatlas not found in /usr/local/lib
libraries lapack_atlas not found in /usr/lib
libraries tatlas,tatlas not found in /usr/lib
libraries lapack_atlas not found in /usr/lib/
libraries tatlas,tatlas not found in /usr/lib/
<class 'numpy.distutils.system_info.atlas_3_10_threads_info'>
NOT AVAILABLE
atlas_3_10_info:
libraries lapack_atlas not found in /usr/local/lib
libraries satlas,satlas not found in /usr/local/lib
libraries lapack_atlas not found in /usr/lib
libraries satlas,satlas not found in /usr/lib
libraries lapack_atlas not found in /usr/lib/
libraries satlas,satlas not found in /usr/lib/
<class 'numpy.distutils.system_info.atlas_3_10_info'>
NOT AVAILABLE
atlas_threads_info:
Setting PTATLAS=ATLAS
libraries lapack_atlas not found in /usr/local/lib
libraries ptf77blas,ptcblas,atlas not found in /usr/local/lib
libraries lapack_atlas not found in /usr/lib
libraries ptf77blas,ptcblas,atlas not found in /usr/lib
libraries lapack_atlas not found in /usr/lib/
libraries ptf77blas,ptcblas,atlas not found in /usr/lib/
<class 'numpy.distutils.system_info.atlas_threads_info'>
NOT AVAILABLE
atlas_info:
libraries lapack_atlas not found in /usr/local/lib
libraries f77blas,cblas,atlas not found in /usr/local/lib
libraries lapack_atlas not found in /usr/lib
libraries f77blas,cblas,atlas not found in /usr/lib
libraries lapack_atlas not found in /usr/lib/
libraries f77blas,cblas,atlas not found in /usr/lib/
<class 'numpy.distutils.system_info.atlas_info'>
NOT AVAILABLE
lapack_info:
libraries lapack not found in ['/usr/local/lib', '/usr/lib', '/usr/lib/']
NOT AVAILABLE
/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/system_info.py:1858: UserWarning:
Lapack (http://www.netlib.org/lapack/) libraries not found.
Directories to search for the libraries can be specified in the
numpy/distutils/site.cfg file (section [lapack]) or by setting
the LAPACK environment variable.
return getattr(self, '_calc_info_{}'.format(name))()
lapack_src_info:
NOT AVAILABLE
/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/system_info.py:1858: UserWarning:
Lapack (http://www.netlib.org/lapack/) sources not found.
Directories to search for the sources can be specified in the
numpy/distutils/site.cfg file (section [lapack_src]) or by setting
the LAPACK_SRC environment variable.
return getattr(self, '_calc_info_{}'.format(name))()
NOT AVAILABLE
numpy_linalg_lapack_lite:
FOUND:
language = c
define_macros = [('HAVE_BLAS_ILP64', None), ('BLAS_SYMBOL_SUFFIX', '64_')]
Warning: attempted relative import with no known parent package
/usr/lib/python3.10/distutils/dist.py:274: UserWarning: Unknown distribution option: 'define_macros'
warnings.warn(msg)
running bdist_wheel
running build
running config_cc
unifing config_cc, config, build_clib, build_ext, build commands --compiler options
running config_fc
unifing config_fc, config, build_clib, build_ext, build commands --fcompiler options
running build_src
build_src
building py_modules sources
creating build
creating build/src.linux-x86_64-3.10
creating build/src.linux-x86_64-3.10/numpy
creating build/src.linux-x86_64-3.10/numpy/distutils
building library "npymath" sources
Could not locate executable gfortran
Could not locate executable f95
Could not locate executable ifort
Could not locate executable ifc
Could not locate executable lf95
Could not locate executable pgfortran
Could not locate executable nvfortran
Could not locate executable f90
Could not locate executable f77
Could not locate executable fort
Could not locate executable efort
Could not locate executable efc
Could not locate executable g77
Could not locate executable g95
Could not locate executable pathf95
Could not locate executable nagfor
Could not locate executable frt
don't know how to compile Fortran code on platform 'posix'
creating build/src.linux-x86_64-3.10/numpy/core
creating build/src.linux-x86_64-3.10/numpy/core/src
creating build/src.linux-x86_64-3.10/numpy/core/src/npymath
conv_template:> build/src.linux-x86_64-3.10/numpy/core/src/npymath/npy_math_internal.h
adding 'build/src.linux-x86_64-3.10/numpy/core/src/npymath' to include_dirs.
conv_template:> build/src.linux-x86_64-3.10/numpy/core/src/npymath/ieee754.c
conv_template:> build/src.linux-x86_64-3.10/numpy/core/src/npymath/npy_math_complex.c
None - nothing done with h_files = ['build/src.linux-x86_64-3.10/numpy/core/src/npymath/npy_math_internal.h']
building library "npyrandom" sources
building extension "numpy.core._multiarray_tests" sources
creating build/src.linux-x86_64-3.10/numpy/core/src/multiarray
conv_template:> build/src.linux-x86_64-3.10/numpy/core/src/multiarray/_multiarray_tests.c
building extension "numpy.core._multiarray_umath" sources
Traceback (most recent call last):
File "/tmp/tmp0bhr5u48_in_process.py", line 363, in <module>
main()
File "/tmp/tmp0bhr5u48_in_process.py", line 345, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "/tmp/tmp0bhr5u48_in_process.py", line 261, in build_wheel
return _build_backend().build_wheel(wheel_directory, config_settings,
File "/tmp/pip-build-env-eya7durf/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 230, in build_wheel
return self._build_with_temp_dir(['bdist_wheel'], '.whl',
File "/tmp/pip-build-env-eya7durf/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 215, in _build_with_temp_dir
self.run_setup()
File "/tmp/pip-build-env-eya7durf/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 267, in run_setup
super(_BuildMetaLegacyBackend,
File "/tmp/pip-build-env-eya7durf/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 158, in run_setup
exec(compile(code, __file__, 'exec'), locals())
File "setup.py", line 448, in <module>
setup_package()
File "setup.py", line 440, in setup_package
setup(**metadata)
File "/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/core.py", line 169, in setup
return old_setup(**new_attr)
File "/tmp/pip-build-env-eya7durf/overlay/lib/python3.10/site-packages/setuptools/__init__.py", line 153, in setup
return distutils.core.setup(**attrs)
File "/usr/lib/python3.10/distutils/core.py", line 148, in setup
dist.run_commands()
File "/usr/lib/python3.10/distutils/dist.py", line 966, in run_commands
self.run_command(cmd)
File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
cmd_obj.run()
File "/tmp/pip-build-env-eya7durf/overlay/lib/python3.10/site-packages/wheel/bdist_wheel.py", line 299, in run
self.run_command('build')
File "/usr/lib/python3.10/distutils/cmd.py", line 313, in run_command
self.distribution.run_command(command)
File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
cmd_obj.run()
File "/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/command/build.py", line 61, in run
old_build.run(self)
File "/usr/lib/python3.10/distutils/command/build.py", line 135, in run
self.run_command(cmd_name)
File "/usr/lib/python3.10/distutils/cmd.py", line 313, in run_command
self.distribution.run_command(command)
File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
cmd_obj.run()
File "/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/command/build_src.py", line 144, in run
self.build_sources()
File "/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/command/build_src.py", line 161, in build_sources
self.build_extension_sources(ext)
File "/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/command/build_src.py", line 318, in build_extension_sources
sources = self.generate_sources(sources, ext)
File "/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/distutils/command/build_src.py", line 378, in generate_sources
source = func(extension, build_dir)
File "/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/core/setup.py", line 434, in generate_config_h
moredefs, ignored = cocache.check_types(config_cmd, ext, build_dir)
File "/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/core/setup.py", line 44, in check_types
out = check_types(*a, **kw)
File "/tmp/pip-install-frs2c34j/numpy_afe54773ba8d409e8f86463616547cf4/numpy/core/setup.py", line 289, in check_types
raise SystemError(
SystemError: Cannot compile 'Python.h'. Perhaps you need to install python-dev|python-devel.
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for numpy
Failed to build numpy
ERROR: Could not build wheels for numpy, which is required to install pyproject.toml-based projects
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error
× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
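The decisive line is near the end of the output: numpy's source build raises SystemError because it can't compile Python.h, i.e. the CPython development headers are missing. On Alpine those come from python3-dev; a minimal sketch of just the missing pieces (package names assumed current):

```dockerfile
FROM nodered/node-red
USER root
# Headers and compilers that source builds of numpy/shapely need on Alpine:
RUN apk add --no-cache py3-pip python3-dev g++ geos-dev
RUN pip install shapely
```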
Following this: https://github.com/docker-library/python/issues/381#issuecomment-463880366 I added the suggested lines to the Dockerfile and the build now succeeds.
New Dockerfile:
FROM nodered/node-red
USER root
RUN apk update
RUN apk add py3-pip
RUN pip install pymap3d
RUN apk --update add build-base libxslt-dev
RUN apk add --virtual .build-deps \
        --repository http://dl-cdn.alpinelinux.org/alpine/edge/testing \
        --repository http://dl-cdn.alpinelinux.org/alpine/edge/main \
        gcc libc-dev geos-dev geos && \
    runDeps="$(scanelf --needed --nobanner --recursive /usr/local \
        | awk '{ gsub(/,/, "\nso:", $2); print "so:" $2 }' \
        | xargs -r apk info --installed \
        | sort -u)" && \
    apk add --virtual .rundeps $runDeps
RUN geos-config --cflags
#RUN pip install --disable-pip-version-check shapely
#RUN apk del build-base python3-dev && \
#    rm -rf /var/cache/apk/*
RUN apk add --no-cache python3-dev libstdc++ && \
    apk add --no-cache g++ && \
    ln -s /usr/include/locale.h /usr/include/xlocale.h && \
    pip3 install numpy && \
    pip3 install pandas
RUN pip install shapely
Build output: (screenshot in the original post)
How to launch Jupyter with custom extension using a docker

I am trying to launch, via Docker, a JupyterLab with a custom extension (extensionTest). However, I haven't been successful.
Can someone tell me how I can do it, or post an example here?
What is the best base Docker image to use?
Thank you
Best regards
I tried this Dockerfile, but it is not working:
FROM jupyter/minimal-notebook:lab-3.2.3
RUN pip install --no-cache-dir \
        astropy \
        ipytree \
        ipywidgets \
        jupyter \
        numpy \
        poliastro
RUN jupyter labextension install \
        jupyterlab-plotly@4.14.2 \
        plotlywidget@4.14.2
RUN pip install jupyterlab_widgets
COPY ./extensions/ ./extensions/
WORKDIR ./extensions/
RUN python -m pip install ./extensionTest
RUN jupyter labextension extensionTest
ENTRYPOINT start.sh jupyter lab
Thanks
The error that I get:
ERROR: Command errored out with exit status 1:
command: /opt/conda/bin/python /opt/conda/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py build_wheel /tmp/tmp2m0biqur
cwd: /home/jovyan/extensions/extensionTest
Complete output (44 lines):
INFO:hatch_jupyter_builder.utils:Running jupyter-builder
INFO:hatch_jupyter_builder.utils:Building with hatch_jupyter_builder.npm_builder
INFO:hatch_jupyter_builder.utils:With kwargs: {'build_cmd': 'build:prod', 'npm': ['jlpm']}
INFO:hatch_jupyter_builder.utils:Installing build dependencies with npm. This may take a while...
INFO:hatch_jupyter_builder.utils:> /tmp/pip-build-env-gj35ixm0/overlay/bin/jlpm install
yarn install v1.21.1
info No lockfile found.
[1/4] Resolving packages...
warning @jupyterlab/application > @jupyterlab/apputils > url > querystring@0.2.0: The querystring API is considered Legacy. new code should use the URLSearchParams API instead.
warning @jupyterlab/application > @jupyterlab/ui-components > @blueprintjs/core > popper.js@1.16.1: You can find the new Popper v2 at @popperjs/core, this package is dedicated to the legacy v1
warning @jupyterlab/application > @jupyterlab/ui-components > @blueprintjs/core > react-popper > popper.js@1.16.1: You can find the new Popper v2 at @popperjs/core, this package is dedicated to the legacy v1
warning @jupyterlab/builder > terser-webpack-plugin > cacache > @npmcli/move-file@1.1.2: This functionality has been moved to @npmcli/fs
warning @jupyterlab/builder > @jupyterlab/buildutils > crypto@1.0.1: This package is no longer supported. It's now a built-in Node module. If you've depended on crypto, you should switch to the one that's built-in.
warning @jupyterlab/builder > @jupyterlab/buildutils > verdaccio > request@2.88.0: request has been deprecated, see https://github.com/request/request/issues/3142
warning @jupyterlab/builder > @jupyterlab/buildutils > verdaccio > request > har-validator@5.1.5: this library is no longer supported
warning @jupyterlab/builder > @jupyterlab/buildutils > verdaccio > request > uuid@3.4.0: Please upgrade to version 7 or higher. Older versions may use Math.random() in certain circumstances, which is known to be problematic. See https://v8.dev/blog/math-random for details.
[2/4] Fetching packages...
warning @blueprintjs/core@3.54.0: Invalid bin entry for "upgrade-blueprint-2.0.0-rename" (in "@blueprintjs/core").
warning @blueprintjs/core@3.54.0: Invalid bin entry for "upgrade-blueprint-3.0.0-rename" (in "@blueprintjs/core").
error lib0@0.2.58: The engine "node" is incompatible with this module. Expected version ">=14". Got "12.4.0"
error Found incompatible module.
info Visit https://yarnpkg.com/en/docs/cli/install for documentation about this command.
Traceback (most recent call last):
File "/opt/conda/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 363, in <module>
main()
File "/opt/conda/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 345, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "/opt/conda/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 261, in build_wheel
return _build_backend().build_wheel(wheel_directory, config_settings,
File "/tmp/pip-build-env-gj35ixm0/overlay/lib/python3.9/site-packages/hatchling/build.py", line 41, in build_wheel
return os.path.basename(next(builder.build(wheel_directory, ['standard'])))
File "/tmp/pip-build-env-gj35ixm0/overlay/lib/python3.9/site-packages/hatchling/builders/plugin/interface.py", line 136, in build
build_hook.initialize(version, build_data)
File "/tmp/pip-build-env-gj35ixm0/normal/lib/python3.9/site-packages/hatch_jupyter_builder/plugin.py", line 83, in initialize
raise e
File "/tmp/pip-build-env-gj35ixm0/normal/lib/python3.9/site-packages/hatch_jupyter_builder/plugin.py", line 78, in initialize
build_func(self.target_name, version, **build_kwargs)
File "/tmp/pip-build-env-gj35ixm0/normal/lib/python3.9/site-packages/hatch_jupyter_builder/utils.py", line 114, in npm_builder
run(npm_cmd + ["install"], cwd=str(abs_path))
File "/tmp/pip-build-env-gj35ixm0/normal/lib/python3.9/site-packages/hatch_jupyter_builder/utils.py", line 227, in run
return subprocess.check_call(cmd, **kwargs)
File "/opt/conda/lib/python3.9/subprocess.py", line 373, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['/tmp/pip-build-env-gj35ixm0/overlay/bin/jlpm', 'install']' returned non-zero exit status 1.
So, to anyone interested, I was able to create a Dockerfile that builds and runs your Jupyter extension. My code is this:
FROM node:14 AS build-env
RUN apt-get update && \
apt-get install -y python3-pip && \
pip3 install jupyterlab
COPY Path/to/Extension Path/to/Extension
WORKDIR Path/to/Extension
RUN yarn install && yarn build && yarn run build
FROM jupyter/minimal-notebook:lab-3.2.3
RUN pip install --no-cache-dir \
astropy \
ipytree \
ipywidgets \
jupyter \
numpy \
poliastro
RUN jupyter labextension install \
jupyterlab-plotly@4.14.2 \
@jupyter-widgets/jupyterlab-manager \
plotlywidget@4.14.2
COPY --from=build-env Path/to/Extension/on/Nodejs Path/to/Extension
RUN jupyter labextension install Path/to/Extension
If someone knows something that can be simplified, let me know :)
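Since the answer invites simplifications: on JupyterLab 3, an extension packaged as a prebuilt (federated) extension can be installed with pip alone, which would remove the node:14 build stage entirely. A sketch under that assumption (that extensionTest ships prebuilt assets in its package is an assumption, and the paths are placeholders):

```dockerfile
FROM jupyter/minimal-notebook:lab-3.2.3

RUN pip install --no-cache-dir \
    astropy \
    ipytree \
    ipywidgets \
    numpy \
    poliastro

# Prebuilt (federated) JupyterLab 3 extensions install via pip alone;
# `jupyter labextension install` and a separate Node build stage are
# only needed for source extensions.
COPY ./extensions/extensionTest ./extensionTest
RUN pip install --no-cache-dir ./extensionTest

# optional sanity check that the extension registered with JupyterLab
RUN jupyter labextension list
```

If the extension only ships as a source extension, the multi-stage build above is still the way to go.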

Dockerfile to answer Y & N for single command

My Dockerfile is below:
FROM ubuntu:20.04
RUN apt install -y aptitude
RUN aptitude -f install python-dev
Running aptitude -f install python-dev manually inside the container prompts me to choose n and Y, as follows.
The following NEW packages will be installed:
libpython-dev{a} libpython-stdlib{a} python{a} python-dev python-minimal{a}
0 packages upgraded, 5 newly installed, 0 to remove and 5 not upgraded.
Need to get 182 kB of archives. After unpacking 898 kB will be used.
The following packages have unmet dependencies:
libpython2-stdlib : Breaks: libpython-stdlib (< 2.7.15-2) but 2.7.11-1 is to be installed
python2 : Breaks: python (< 2.7.15-2) but 2.7.11-1 is to be installed
python2-minimal : Breaks: python-minimal (< 2.7.15-2) but 2.7.11-1 is to be installed
python-is-python2 : Breaks: python but 2.7.11-1 is to be installed
The following actions will resolve these dependencies:
Keep the following packages at their current version:
1) libpython-stdlib [Not Installed]
2) python [Not Installed]
3) python-dev [Not Installed]
4) python-minimal [Not Installed]
Accept this solution? [Y/n/q/?] n
The following actions will resolve these dependencies:
Remove the following packages:
1) libpython2-stdlib [2.7.17-2ubuntu4 (focal, now)]
2) mic [0.28.6 (now)]
3) python-is-python2 [2.7.17-4 (focal, now)]
4) python-pycurl [7.43.0.2-1ubuntu5 (focal, now)]
5) python-pyparsing [2.4.6-1 (focal, now)]
6) python-rpm [4.14.2.1+dfsg1-1build2 (focal, now)]
7) python-urlgrabber [3.9.1-4ubuntu3 (now)]
8) python2 [2.7.17-2ubuntu4 (focal, now)]
9) python2-minimal [2.7.17-2ubuntu4 (focal, now)]
Leave the following dependencies unresolved:
10) python2-minimal recommends python2
Accept this solution? [Y/n/q/?] y
My goal is to build the Docker image through the Dockerfile. So, in the Dockerfile, how can the n and y prompts above be answered automatically? Any help would be highly appreciated.
You can disable interactive prompting from apt by setting the DEBIAN_FRONTEND environment variable to noninteractive. This is documented in the debconf(7) man page. That would look something like:
FROM ubuntu:20.04
RUN apt update && DEBIAN_FRONTEND=noninteractive apt install -y python-dev
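A related sketch, if you would rather keep the noninteractive setting out of the final image: declaring DEBIAN_FRONTEND as a build ARG scopes it to the build, and aptitude itself accepts -y to auto-accept its resolver's proposed solution. Note that -y takes the first solution the resolver offers; in the transcript above the asker rejected the first solution, so whether that is acceptable here is an assumption:

```dockerfile
FROM ubuntu:20.04
# ARG applies only while the image is being built, so DEBIAN_FRONTEND
# does not persist in containers started from the final image.
ARG DEBIAN_FRONTEND=noninteractive
RUN apt-get update && \
    apt-get install -y aptitude && \
    aptitude -y -f install python-dev
```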

About automatically playing a YouTube video into Clubhouse

I am trying to write some code to automatically play an online radio stream into the Clubhouse Android application, so if possible I would like some comments on:
Finding some Python or PHP code to automatically play a YouTube live stream from the Google Colaboratory environment, like the result of the Google search below:
Automate clubhouse Messages With Python
Update 1:
I have found two results on Google, which can be seen below:
A third-party web application based on Flask to play Clubhouse audio
A simple script that will watch a stream for you and earn the channel points.
I tried to run the first instruction above on Google Colab (link) but get the error below:
!git clone https://github.com/ai-eks/OpenClubhouse-Worker
%cd OpenClubhouse-Worker
!pip install -r requirements.txt
Cloning into 'OpenClubhouse-Worker'...
remote: Enumerating objects: 61, done.
remote: Counting objects: 100% (61/61), done.
remote: Compressing objects: 100% (47/47), done.
remote: Total 61 (delta 31), reused 38 (delta 14), pack-reused 0
Unpacking objects: 100% (61/61), done.
/content/OpenClubhouse-Worker
Collecting pymongo==3.11.3
Downloading pymongo-3.11.3-cp37-cp37m-manylinux2014_x86_64.whl (512 kB)
|████████████████████████████████| 512 kB 5.2 MB/s
Collecting requests==2.25.1
Downloading requests-2.25.1-py2.py3-none-any.whl (61 kB)
|████████████████████████████████| 61 kB 6.8 MB/s
Collecting requests-openapi==0.9.7
Downloading requests_openapi-0.9.7-py3-none-any.whl (5.1 kB)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/local/lib/python3.7/dist-packages (from requests==2.25.1->-r requirements.txt (line 2)) (1.24.3)
Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.7/dist-packages (from requests==2.25.1->-r requirements.txt (line 2)) (2.10)
Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.7/dist-packages (from requests==2.25.1->-r requirements.txt (line 2)) (2021.5.30)
Requirement already satisfied: chardet<5,>=3.0.2 in /usr/local/lib/python3.7/dist-packages (from requests==2.25.1->-r requirements.txt (line 2)) (3.0.4)
Requirement already satisfied: pyyaml in /usr/local/lib/python3.7/dist-packages (from requests-openapi==0.9.7->-r requirements.txt (line 3)) (3.13)
Installing collected packages: requests, requests-openapi, pymongo
Attempting uninstall: requests
Found existing installation: requests 2.23.0
Uninstalling requests-2.23.0:
Successfully uninstalled requests-2.23.0
Attempting uninstall: pymongo
Found existing installation: pymongo 3.12.0
Uninstalling pymongo-3.12.0:
Successfully uninstalled pymongo-3.12.0
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
google-colab 1.0.0 requires requests~=2.23.0, but you have requests 2.25.1 which is incompatible.
datascience 0.10.6 requires folium==0.2.1, but you have folium 0.8.3 which is incompatible.
Successfully installed pymongo-3.11.3 requests-2.25.1 requests-openapi-0.9.7
So I am trying to make it work by updating the question and working based on the comments.
Update 2:
Based on Marco's help, the first error is fixed, and I got a new error, as you can see below:
Traceback (most recent call last):
File "main.py", line 38, in <module>
main()
File "main.py", line 26, in main
chh = ClubHouseHelper(phone=phone, url=api_uri, device_id=device_id)
File "/content/OpenClubhouse-Worker/OpenClubhouse-Worker/OpenClubhouse-Worker/OpenClubhouse-Worker/ch_helper.py", line 9, in __init__
self.client.load_spec_from_file(url)
File "/usr/local/lib/python3.7/dist-packages/requests_openapi/core.py", line 252, in load_spec_from_file
spec = load_spec_from_file(file_path)
File "/usr/local/lib/python3.7/dist-packages/requests_openapi/core.py", line 159, in load_spec_from_file
return yaml.load(spec_str, Loader=yaml.Loader)
File "/usr/local/lib/python3.7/dist-packages/yaml/__init__.py", line 70, in load
loader = Loader(stream)
File "/usr/local/lib/python3.7/dist-packages/yaml/loader.py", line 34, in __init__
Reader.__init__(self, stream)
File "/usr/local/lib/python3.7/dist-packages/yaml/reader.py", line 74, in __init__
self.check_printable(stream)
File "/usr/local/lib/python3.7/dist-packages/yaml/reader.py", line 144, in check_printable
'unicode', "special characters are not allowed")
yaml.reader.ReaderError: unacceptable character #x1f579: special characters are not allowed
in "<unicode string>", position 6373
Thanks.
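That last ReaderError means the downloaded spec contains a character (the U+1F579 joystick emoji) outside the printable set that PyYAML's Reader accepts, so one way forward is to sanitize the text before it reaches yaml.load. A minimal sketch, assuming you can filter the spec string yourself; strip_non_printable is a hypothetical helper, not part of requests_openapi:

```python
import re

# PyYAML's Reader only accepts this printable set; anything else (such as
# the U+1F579 emoji in the error above) raises "special characters are
# not allowed", so those characters are deleted here.
_NON_PRINTABLE = re.compile(
    r'[^\x09\x0a\x0d\x20-\x7e\x85\xa0-\ud7ff\ue000-\ufffd]'
)

def strip_non_printable(spec_str):
    # remove every character outside PyYAML's printable range
    return _NON_PRINTABLE.sub('', spec_str)

print(strip_non_printable('title: "\U0001F579 Clubhouse API"'))
```

After sanitizing, yaml.load (or, preferably, yaml.safe_load) should no longer reject the spec for that character.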
