Install OpenCV for python3 - opencv

I have followed many manuals/tutorials on how to install OpenCV, but they all seem to end up installing it for my python2.7 instead of python3.4, where I want it. I'm following this tutorial but without using virtualenv. When running the
$cmake \
-D CMAKE_BUILD_TYPE=RELEASE \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D INSTALL_C_EXAMPLES=OFF \
-D INSTALL_PYTHON_EXAMPLES=ON \
-D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib/modules \
-D BUILD_EXAMPLES=ON \
-D PYTHON_EXECUTABLE=/usr/bin/python3.4 \
-D PYTHON_PACKAGES_PATHS=/usr/local/lib/python3.4/dist-packages/ \
-D PYTHON_NUMPY_INCLUDE_DIRS=/usr/local/lib/python3.4/dist-packages/numpy/core/include ..
command, it lists both versions:
-- Python 2:
-- Interpreter: /usr/bin/python2.7 (ver 3.4.3)
-- Libraries: NO
-- numpy: /usr/local/lib/python2.7/dist-packages/numpy/core/include (ver 1.10.4)
-- packages path: lib/python2.7/dist-packages
--
-- Python 3:
-- Interpreter: /usr/bin/python3.4 (ver 3.4.3)
-- Libraries: NO
-- numpy: /usr/local/lib/python3.4/dist-packages/numpy/core/include (ver 1.10.4)
-- packages path: lib/python3.4/dist-packages
--
-- Python (for build): /usr/bin/python2.7
But it ignores the PYTHON_EXECUTABLE flag and uses python2.7 for the build (after continuing with the installation, I confirmed the module ended up working under python2.7).
How can I make it use python3.4 for the build?
Things I tried:
When running this cmake command:
cmake \
-D CMAKE_BUILD_TYPE=RELEASE \
-D CMAKE_INSTALL_PREFIX=$(python3 -c "import sys; print(sys.prefix)") \
-D PYTHON_EXECUTABLE=$(which python3) ..
It lists the libraries correctly:
-- Python 2:
-- Interpreter: /usr/bin/python2.7 (ver 3.4.3)
-- Libraries: NO
-- numpy: /usr/local/lib/python2.7/dist-packages/numpy/core/include (ver 1.10.4)
-- packages path: lib/python2.7/dist-packages
--
-- Python 3:
-- Interpreter: /usr/bin/python3.4 (ver 3.4.3)
-- Libraries: /usr/lib/x86_64-linux-gnu/libpython3.4m.so (ver 3.4.3)
-- numpy: /usr/local/lib/python3.4/dist-packages/numpy/core/include (ver 1.10.4)
-- packages path: lib/python3.4/dist-packages
--
-- Python (for build): /usr/bin/python2.7
But it still lists python2.7 as the Python used for the build.
Related info:
$whereis python3
python3: /usr/bin/python3.4dm-config /usr/bin/python3.4m /usr/bin/python3.4m-config /usr/bin/python3.4-config /usr/bin/python3 /usr/bin/python3.4-dbg-config /usr/bin/python3.4 /usr/bin/python3.4-dbg /usr/bin/python3.4dm /etc/python3 /etc/python3.4 /usr/lib/python3.0 /usr/lib/python3.5 /usr/lib/python3 /usr/lib/python3.4 /usr/lib/python3.2 /usr/lib/python3.1 /usr/lib/python3.3 /usr/bin/X11/python3.4dm-config /usr/bin/X11/python3.4m /usr/bin/X11/python3.4m-config /usr/bin/X11/python3.4-config /usr/bin/X11/python3 /usr/bin/X11/python3.4-dbg-config /usr/bin/X11/python3.4 /usr/bin/X11/python3.4-dbg /usr/bin/X11/python3.4dm /usr/local/lib/python3.4 /usr/include/python3.4m /usr/include/python3.4 /usr/include/python3.4dm /usr/share/python3 /usr/share/man/man1/python3.1.gz
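For reference, the interpreter, header, and NumPy paths that the PYTHON_* cmake flags above expect can be printed directly; a small helper sketch (assuming python3 and numpy are installed system-wide):
which python3
python3 -c "from distutils.sysconfig import get_python_inc; print(get_python_inc())"
python3 -c "from distutils.sysconfig import get_python_lib; print(get_python_lib())"
python3 -c "import numpy; print(numpy.get_include())"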

I found the answer: after removing the CMake cache (rm CMakeCache.txt), I reran the cmake command:
cmake \
-D CMAKE_BUILD_TYPE=RELEASE \
-D CMAKE_INSTALL_PREFIX=$(python3 -c "import sys; print(sys.prefix)") \
-D PYTHON_EXECUTABLE=/usr/bin/python3.4 \
-D BUILD_EXAMPLES=ON \
-D INSTALL_C_EXAMPLES=OFF \
-D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib/modules \
-D INSTALL_PYTHON_EXAMPLES=ON ..
And the output was:
-- Python 2:
-- Interpreter: /usr/bin/python3.4 (ver 3.4.3)
-- Libraries: /usr/lib/x86_64-linux-gnu/libpython3.4m.so (ver 3.4.3)
-- numpy: /usr/local/lib/python3.4/dist-packages/numpy/core/include (ver 1.10.4)
-- packages path: lib/python3.4/dist-packages
--
-- Python 3:
-- Interpreter: /usr/bin/python3.4 (ver 3.4.3)
-- Libraries: /usr/lib/x86_64-linux-gnu/libpython3.4m.so (ver 3.4.3)
-- numpy: /usr/local/lib/python3.4/dist-packages/numpy/core/include (ver 1.10.4)
-- packages path: lib/python3.4/dist-packages
--
-- Python (for build): /usr/bin/python3.4
--
So I continued with the installation:
make -j4
sudo make install
sudo ldconfig
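A quick sanity check that the bindings really landed under Python 3 (and to see exactly which cv2 each interpreter picks up):
python3 -c "import cv2; print(cv2.__version__, cv2.__file__)"
python2.7 -c "import cv2; print(cv2.__file__)"   # may fail or point at an older 2.7 build, if one exists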

Related

mpicc takes a long time in an Alpine container

I have a Docker container running alpine:latest in which I installed build-base, openmpi, openmpi-dev, etc., and basically everything works fine, except when I run
mpicc -v -time=time_out -o /root/cloud/test /root/cloud/mpi_hello_world.c
The preprocessing stage [-E] takes ~90 s the first time; the second time it takes less than a second. I attached the output of the -v option to mpicc below. Please note that the produced executable runs fine and fast on all my nodes/slots.
To investigate, I looked at the verbose output of mpicc -v [...]; between
...
End of search list.
<---- Between these two lines we spend ~85sec estimated ---->
GNU C17 (Alpine 10.3.1_git20211027) version 10.3.1 20211027 (x86_64-alpine-linux-musl)
...
we lose time. I have a hunch that gcc searches for something which it eventually finds, but I don't know what it is.
Can someone please help me identify the missing element?
Please see the output of the mpicc -v [...] command:
bash-5.1# mpicc -v -time=time_out -o /root/cloud/test /root/cloud/mpi_hello_world.c | tee /root/myFiles/mpicc_verbose
Using built-in specs.
COLLECT_GCC=/usr/bin/gcc
COLLECT_LTO_WRAPPER=/usr/libexec/gcc/x86_64-alpine-linux-musl/10.3.1/lto-wrapper
Target: x86_64-alpine-linux-musl
Configured with: /home/buildozer/aports/main/gcc/src/gcc-10.3.1_git20211027/configure --prefix=/usr --mandir=/usr/share/man --infodir=/usr/share/info --build=x86_64-alpine-linux-musl --host=x86_64-alpine-linux-musl --target=x86_64-alpine-linux-musl --with-pkgversion='Alpine 10.3.1_git20211027' --enable-checking=release --disable-fixed-point --disable-libstdcxx-pch --disable-multilib --disable-nls --disable-werror --disable-symvers --enable-__cxa_atexit --enable-default-pie --enable-default-ssp --enable-cloog-backend --enable-languages=c,c++,d,objc,go,fortran,ada --disable-libssp --disable-libmpx --disable-libmudflap --disable-libsanitizer --enable-shared --enable-threads --enable-tls --with-system-zlib --with-linker-hash-style=gnu
Thread model: posix
Supported LTO compression algorithms: zlib
gcc version 10.3.1 20211027 (Alpine 10.3.1_git20211027)
COLLECT_GCC_OPTIONS='-v' '-o' '/root/cloud/test' '-mtune=generic' '-march=x86-64'
/usr/libexec/gcc/x86_64-alpine-linux-musl/10.3.1/cc1 -quiet -v /root/cloud/mpi_hello_world.c -quiet -dumpbase mpi_hello_world.c -mtune=generic -march=x86-64 -auxbase mpi_hello_world -version -o /tmp/ccdhIMIE.s
GNU C17 (Alpine 10.3.1_git20211027) version 10.3.1 20211027 (x86_64-alpine-linux-musl)
compiled by GNU C version 10.3.1 20211027, GMP version 6.2.1, MPFR version 4.1.0, MPC version 1.2.1, isl version isl-0.22-GMP
GGC heuristics: --param ggc-min-expand=100 --param ggc-min-heapsize=131072
ignoring nonexistent directory "/usr/local/include"
ignoring nonexistent directory "/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../x86_64-alpine-linux-musl/include"
#include "..." search starts here:
#include <...> search starts here:
/usr/include/fortify
/usr/include
/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/include
End of search list.
GNU C17 (Alpine 10.3.1_git20211027) version 10.3.1 20211027 (x86_64-alpine-linux-musl)
compiled by GNU C version 10.3.1 20211027, GMP version 6.2.1, MPFR version 4.1.0, MPC version 1.2.1, isl version isl-0.22-GMP
GGC heuristics: --param ggc-min-expand=100 --param ggc-min-heapsize=131072
Compiler executable checksum: 3193578801129247e8be66bd6dd0fe05
COLLECT_GCC_OPTIONS='-v' '-o' '/root/cloud/test' '-mtune=generic' '-march=x86-64'
/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../x86_64-alpine-linux-musl/bin/as -v --64 -o /tmp/ccFKfcmE.o /tmp/ccdhIMIE.s
GNU assembler version 2.37 (x86_64-alpine-linux-musl) using BFD version (GNU Binutils) 2.37
COMPILER_PATH=/usr/libexec/gcc/x86_64-alpine-linux-musl/10.3.1/:/usr/libexec/gcc/x86_64-alpine-linux-musl/10.3.1/:/usr/libexec/gcc/x86_64-alpine-linux-musl/:/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/:/usr/lib/gcc/x86_64-alpine-linux-musl/:/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../x86_64-alpine-linux-musl/bin/
LIBRARY_PATH=/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/:/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../x86_64-alpine-linux-musl/lib/../lib/:/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../lib/:/lib/../lib/:/usr/lib/../lib/:/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../x86_64-alpine-linux-musl/lib/:/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../:/lib/:/usr/lib/
COLLECT_GCC_OPTIONS='-v' '-o' '/root/cloud/test' '-mtune=generic' '-march=x86-64'
/usr/libexec/gcc/x86_64-alpine-linux-musl/10.3.1/collect2 -plugin /usr/libexec/gcc/x86_64-alpine-linux-musl/10.3.1/liblto_plugin.so -plugin-opt=/usr/libexec/gcc/x86_64-alpine-linux-musl/10.3.1/lto-wrapper -plugin-opt=-fresolution=/tmp/ccmkCMLh.res -plugin-opt=-pass-through=-lgcc -plugin-opt=-pass-through=-lgcc_s -plugin-opt=-pass-through=-lc -plugin-opt=-pass-through=-lgcc -plugin-opt=-pass-through=-lgcc_s --eh-frame-hdr --hash-style=gnu -m elf_x86_64 --as-needed -dynamic-linker /lib/ld-musl-x86_64.so.1 -pie -z relro -z now -o /root/cloud/test /usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../lib/Scrt1.o /usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../lib/crti.o /usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/crtbeginS.o -L/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1 -L/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../x86_64-alpine-linux-musl/lib/../lib -L/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../lib -L/lib/../lib -L/usr/lib/../lib -L/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../x86_64-alpine-linux-musl/lib -L/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../.. /tmp/ccFKfcmE.o -rpath /usr/lib --enable-new-dtags -lmpi -lssp_nonshared -lgcc --push-state --as-needed -lgcc_s --pop-state -lc -lgcc --push-state --as-needed -lgcc_s --pop-state /usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/crtendS.o /usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../lib/crtn.o
COLLECT_GCC_OPTIONS='-v' '-o' '/root/cloud/test' '-mtune=generic' '-march=x86-64'
Here is my time_out file as well:
0.030072 0.006682 cc1 -quiet -v /root/cloud/mpi_hello_world.c -quiet -dumpbase mpi_hello_world.c -mtune=generic -march=x86-64 -auxbase mpi_hello_world -version -o /tmp/ccdhIMIE.s
0.002234 0.0017 as -v --64 -o /tmp/ccFKfcmE.o /tmp/ccdhIMIE.s
0.009905 0.011814 collect2 -plugin /usr/libexec/gcc/x86_64-alpine-linux-musl/10.3.1/liblto_plugin.so -plugin-opt=/usr/libexec/gcc/x86_64-alpine-linux-musl/10.3.1/lto-wrapper -plugin-opt=-fresolution=/tmp/ccmkCMLh.res -plugin-opt=-pass-through=-lgcc -plugin-opt=-pass-through=-lgcc_s -plugin-opt=-pass-through=-lc -plugin-opt=-pass-through=-lgcc -plugin-opt=-pass-through=-lgcc_s --eh-frame-hdr --hash-style=gnu -m elf_x86_64 --as-needed -dynamic-linker /lib/ld-musl-x86_64.so.1 -pie -z relro -z now -o /root/cloud/test /usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../lib/Scrt1.o /usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../lib/crti.o /usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/crtbeginS.o -L/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1 -L/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../x86_64-alpine-linux-musl/lib/../lib -L/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../lib -L/lib/../lib -L/usr/lib/../lib -L/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../x86_64-alpine-linux-musl/lib -L/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../.. /tmp/ccFKfcmE.o -rpath /usr/lib --enable-new-dtags -lmpi -lssp_nonshared -lgcc --push-state --as-needed -lgcc_s --pop-state -lc -lgcc --push-state --as-needed -lgcc_s --pop-state /usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/crtendS.o /usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../lib/crtn.o
There doesn't seem to be a problem in the time_out file; the measured stages are fast.
Code is from here: mpi-hello-world/code
Thank you <3
Edit: please see the Dockerfile:
FROM amd64/alpine@sha256:a777c9c66ba177ccfea23f2a216ff6721e78a662cd17019488c417135299cd89 as node
ARG USER=mpiuser
ARG SSH_PATH=/etc/ssh
RUN ping -c 2 8.8.8.8
RUN apk add --no-cache \
bash \
build-base \
libc6-compat \
openmpi openmpi-dev \
openssh \
openrc \
nfs-utils \
neovim \
tini
RUN rm -rf /var/cache/apk
#https://wiki.alpinelinux.org/wiki/Setting_up_a_nfs-server
#https://wiki.alpinelinux.org/wiki/Setting_up_a_SSH_server
RUN adduser -S ${USER} -g "MPI Test User" -s /bin/ash -D ${USER} \
&& echo "${USER} ALL=(ALL) NOPASSWD:ALL" >> /etc/sudoers \
&& echo ${USER}:* | chpasswd \
&& echo root:* | chpasswd
RUN mkdir ~/.ssh \
# && rc-update add sshd \
# && rc-status \
# touch softlevel because system was initialized without openrc
&& echo "PermitRootLogin yes" >> ${SSH_PATH}/sshd_config \
&& echo "PubkeyAuthentication yes" >> ${SSH_PATH}/sshd_config \
&& echo "StrictHostKeyChecking no" >> ${SSH_PATH}/ssh_config \
&& rm /etc/motd
COPY --chmod=770 ./node_script/helper_node.sh /root/
RUN mkdir ~/cloud
# Using tini - All Tini does is spawn a single child (Tini is meant to be run in a container), and wait for it to exit all the while reaping zombies and performing signal forwarding.
# Docu: https://github.com/krallin/tini
ENTRYPOINT ["/sbin/tini", "-g", "-e 143" ,"-e 137", "--", "/root/helper_node.sh"]
And also /root/helper_node.sh:
# Start sshd (the ssh server) but silence its output
/usr/sbin/sshd -D -d -h /root/.ssh/id_rsa -f /etc/ssh/sshd_config > /dev/null 2>&1
Launch with docker-compose: docker-compose rm -fsv; docker-compose build && docker compose up --scale node=4
Edit 2: the issue reproduces with mpicc -E, mpicc -S and mpicc -C (full commands omitted for readability); we see the same behaviour.
One interesting observation: mpicc -v -E [...] gives:
mpicc -v -E -o test.i /root/cloud/mpi_hello_world.c
Using built-in specs.
COLLECT_GCC=/usr/bin/gcc
Target: x86_64-alpine-linux-musl
Configured with: /home/buildozer/aports/main/gcc/src/gcc-10.3.1_git20211027/configure --prefix=/usr --mandir=/usr/share/man --infodir=/usr/share/info --build=x86_64-alpine-linux-musl --host=x86_64-alpine-linux-musl --target=x86_64-alpine-linux-musl --with-pkgversion='Alpine 10.3.1_git20211027' --enable-checking=release --disable-fixed-point --disable-libstdcxx-pch --disable-multilib --disable-nls --disable-werror --disable-symvers --enable-__cxa_atexit --enable-default-pie --enable-default-ssp --enable-cloog-backend --enable-languages=c,c++,d,objc,go,fortran,ada --disable-libssp --disable-libmpx --disable-libmudflap --disable-libsanitizer --enable-shared --enable-threads --enable-tls --with-system-zlib --with-linker-hash-style=gnu
Thread model: posix
Supported LTO compression algorithms: zlib
gcc version 10.3.1 20211027 (Alpine 10.3.1_git20211027)
COLLECT_GCC_OPTIONS='-v' '-E' '-o' 'test.i' '-mtune=generic' '-march=x86-64'
/usr/libexec/gcc/x86_64-alpine-linux-musl/10.3.1/cc1 -E -quiet -v /root/cloud/mpi_hello_world.c -o test.i -mtune=generic -march=x86-64
ignoring nonexistent directory "/usr/local/include"
ignoring nonexistent directory "/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../x86_64-alpine-linux-musl/include"
#include "..." search starts here:
#include <...> search starts here:
/usr/include/fortify
/usr/include
/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/include
End of search list.
<------------------ Wait time here ------------------->
COMPILER_PATH=/usr/libexec/gcc/x86_64-alpine-linux-musl/10.3.1/:/usr/libexec/gcc/x86_64-alpine-linux-musl/10.3.1/:/usr/libexec/gcc/x86_64-alpine-linux-musl/:/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/:/usr/lib/gcc/x86_64-alpine-linux-musl/:/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../x86_64-alpine-linux-musl/bin/
LIBRARY_PATH=/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/:/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../x86_64-alpine-linux-musl/lib/../lib/:/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../lib/:/lib/../lib/:/usr/lib/../lib/:/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../x86_64-alpine-linux-musl/lib/:/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../:/lib/:/usr/lib/
COLLECT_GCC_OPTIONS='-v' '-E' '-o' 'test.i' '-mtune=generic' '-march=x86-64'
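Before the fix below, one way to see where that gap actually goes is to trace the compiler driver; a diagnostic sketch (strace is not in the image by default, and the container may need to be started with --cap-add=SYS_PTRACE for it to work):
apk add --no-cache strace
strace -f -T -o /tmp/mpicc_trace.log mpicc -E -o /tmp/test.i /root/cloud/mpi_hello_world.c
tail -n 100 /tmp/mpicc_trace.log   # syscalls with large times in <...> show where the ~85 s are spent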
Temporary fix: if I add
export COMPILER_PATH=/usr/libexec/gcc/x86_64-alpine-linux-musl/10.3.1/:/usr/libexec/gcc/x86_64-alpine-linux-musl/10.3.1/:/usr/libexec/gcc/x86_64-alpine-linux-musl/:/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/:/usr/lib/gcc/x86_64-alpine-linux-musl/:/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../x86_64-alpine-linux-musl/bin/
export LIBRARY_PATH=/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/:/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../x86_64-alpine-linux-musl/lib/../lib/:/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../lib/:/lib/../lib/:/usr/lib/../lib/:/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../../x86_64-alpine-linux-musl/lib/:/usr/lib/gcc/x86_64-alpine-linux-musl/10.3.1/../../../:/lib/:/usr/lib/
to /etc/profile and then source /etc/profile, everything works as well as one could wish :)
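After appending the two export lines above to /etc/profile, a quick way to confirm they take effect (just a verification sketch):
source /etc/profile
env | grep -E 'COMPILER_PATH|LIBRARY_PATH'
time mpicc -E -o /dev/null /root/cloud/mpi_hello_world.c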

Dockerfile: Python.h: No such file or directory

I have a simple Dockerfile that I am using to run containers on AWS. I'm hitting an issue when installing s3fs, though, which is strange since I've used this snippet in previous Dockerfiles without issue.
Is it something with the distribution?
Error:
multidict/_multidict.c:1:10: fatal error: Python.h: No such file or directory
Dockerfile:
FROM amazonlinux:latest
RUN yum -y install which unzip aws-cli \
&& yum install -y python3-pip python3 python3-setuptools \
&& yum install -y tar.x86_64 \
&& DEBIAN_FRONTEND=noninteractive yum install -y ksh
RUN pip3 install boto3 \
&& yum install -y gcc \
&& pip3 install s3fs
Here is the output log:
Installing collected packages: fsspec, docutils, botocore, typing-extensions, aioitertools, wrapt, attrs, chardet, multidict, async-timeout, idna, yarl, aiohttp, aiobotocore, s3fs
Found existing installation: botocore 1.19.11
Uninstalling botocore-1.19.11:
Successfully uninstalled botocore-1.19.11
Running setup.py install for wrapt: started
Running setup.py install for wrapt: finished with status 'done'
Running setup.py install for multidict: started
Running setup.py install for multidict: finished with status 'error'
Complete output from command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-l35mgnnl/multidict/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-7d6e5wlf-record/install-record.txt --single-version-externally-managed --compile:
**********************
* Accellerated build *
**********************
running install
running build
running build_py
creating build
creating build/lib.linux-x86_64-3.7
creating build/lib.linux-x86_64-3.7/multidict
copying multidict/_abc.py -> build/lib.linux-x86_64-3.7/multidict
copying multidict/__init__.py -> build/lib.linux-x86_64-3.7/multidict
copying multidict/_multidict_base.py -> build/lib.linux-x86_64-3.7/multidict
copying multidict/_multidict_py.py -> build/lib.linux-x86_64-3.7/multidict
copying multidict/_compat.py -> build/lib.linux-x86_64-3.7/multidict
running egg_info
writing multidict.egg-info/PKG-INFO
writing dependency_links to multidict.egg-info/dependency_links.txt
writing top-level names to multidict.egg-info/top_level.txt
reading manifest file 'multidict.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.pyc' found anywhere in distribution
warning: no previously-included files found matching 'multidict/_multidict.html'
warning: no previously-included files found matching 'multidict/*.so'
warning: no previously-included files found matching 'multidict/*.pyd'
warning: no previously-included files found matching 'multidict/*.pyd'
no previously-included directories found matching 'docs/_build'
writing manifest file 'multidict.egg-info/SOURCES.txt'
copying multidict/__init__.pyi -> build/lib.linux-x86_64-3.7/multidict
copying multidict/_multidict.c -> build/lib.linux-x86_64-3.7/multidict
copying multidict/py.typed -> build/lib.linux-x86_64-3.7/multidict
creating build/lib.linux-x86_64-3.7/multidict/_multilib
copying multidict/_multilib/defs.h -> build/lib.linux-x86_64-3.7/multidict/_multilib
copying multidict/_multilib/dict.h -> build/lib.linux-x86_64-3.7/multidict/_multilib
copying multidict/_multilib/istr.h -> build/lib.linux-x86_64-3.7/multidict/_multilib
copying multidict/_multilib/iter.h -> build/lib.linux-x86_64-3.7/multidict/_multilib
copying multidict/_multilib/pair_list.h -> build/lib.linux-x86_64-3.7/multidict/_multilib
copying multidict/_multilib/views.h -> build/lib.linux-x86_64-3.7/multidict/_multilib
running build_ext
building 'multidict._multidict' extension
creating build/temp.linux-x86_64-3.7
creating build/temp.linux-x86_64-3.7/multidict
gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -I/usr/include/python3.7m -c multidict/_multidict.c -o build/temp.linux-x86_64-3.7/multidict/_multidict.o -O2 -std=c99 -Wall -Wsign-compare -Wconversion -fno-strict-aliasing -pedantic
multidict/_multidict.c:1:10: fatal error: Python.h: No such file or directory
#include "Python.h"
^~~~~~~~~~
compilation terminated.
error: command 'gcc' failed with exit status 1
----------------------------------------
Command "/usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-l35mgnnl/multidict/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-7d6e5wlf-record/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-build-l35mgnnl/multidict/
The command '/bin/sh -c pip3 install boto3 && yum install -y gcc && pip3 install s3fs' returned a non-zero code: 1
Any help is much appreciated!
You should add the python3-devel package, which contains the missing headers:
FROM amazonlinux:latest
RUN yum -y install which unzip aws-cli \
&& yum install -y python3-pip python3 python3-setuptools \
&& yum install -y python3-devel.x86_64 \
&& yum install -y tar.x86_64 \
&& DEBIAN_FRONTEND=noninteractive yum install -y ksh
RUN pip3 install boto3 \
&& yum install -y gcc \
&& pip3 install s3fs
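To confirm the headers are actually present after adding python3-devel, check for Python.h; the python3.7m directory matches the -I path in the gcc line from the log above, while the one-liner avoids guessing the minor version:
ls /usr/include/python3.7m/Python.h
python3 -c "from distutils.sysconfig import get_python_inc; print(get_python_inc())"   # Python.h should live here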

GDAL installation in Alpine fails to compile - "error: command 'gcc' failed with exit status 1"

I am trying to install GDAL in an Alpine Docker environment.
I installed GDAL's dependencies and that went fine:
RUN apk add --no-cache gcc build-base /gdal/gdal-dev-2.4.0-r1.apk /gdal/gdal-2.4.0-r1.apk /gdal/geos-3.7.1-r0.apk /gdal/libcrypto1.1-1.1.1b-r1.apk
Then I ran the command "pip install gdal".
It downloads GDAL-3.0.0.tar.gz but ends with an error while installing.
Pruned logs:
Building wheels for collected packages: gdal
Building wheel for gdal (setup.py) ... error
ERROR: Complete output from command /usr/local/bin/python -u -c 'import setuptools, tokenize;__file__='"'"'/tmp/pip-install-hlldvrpz/gdal/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /tmp/pip-wheel-dj2y5pji --python-tag cp37:
ERROR: WARNING: numpy not available! Array support will not be enabled
running bdist_wheel
running build
running build_py
creating build
creating build/lib.linux-x86_64-3.7
copying gdal.py -> build/lib.linux-x86_64-3.7
copying ogr.py -> build/lib.linux-x86_64-3.7
...
...
...
running build_ext
gcc -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -DTHREAD_STACK_SIZE=0x100000 -fPIC -I../../port -I../../gcore -I../../alg -I../../ogr/ -I../../ogr/ogrsf_frmts -I../../gnm -I../../apps -I/usr/local/include/python3.7m -I. -I/usr/include -c gdal_python_cxx11_test.cpp -o gdal_python_cxx11_test.o
building 'osgeo._gdal' extension
creating build/temp.linux-x86_64-3.7
creating build/temp.linux-x86_64-3.7/extensions
gcc -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -DTHREAD_STACK_SIZE=0x100000 -fPIC -I../../port -I../../gcore -I../../alg -I../../ogr/ -I../../ogr/ogrsf_frmts -I../../gnm -I../../apps -I/usr/local/include/python3.7m -I. -I/usr/include -c extensions/gdal_wrap.cpp -o build/temp.linux-x86_64-3.7/extensions/gdal_wrap.o -I/usr/include
extensions/gdal_wrap.cpp: In function 'OSRSpatialReferenceShadow* GDALDatasetShadow_GetSpatialRef(GDALDatasetShadow*)':
extensions/gdal_wrap.cpp:4672:54: error: 'GDALGetSpatialRef' was not declared in this scope
OGRSpatialReferenceH ref = GDALGetSpatialRef(self);
^
extensions/gdal_wrap.cpp: In function 'void GDALDatasetShadow_SetSpatialRef(GDALDatasetShadow*, OSRSpatialReferenceShadow*)':
extensions/gdal_wrap.cpp:4681:57: error: 'GDALSetSpatialRef' was not declared in this scope
GDALSetSpatialRef( self, (OGRSpatialReferenceH)srs );
...
...
In file included from /usr/local/include/python3.7m/Python.h:147:0,
from extensions/gdal_wrap.cpp:173:
/usr/local/include/python3.7m/abstract.h:489:17: note: declared here
PyAPI_FUNC(int) PyObject_AsReadBuffer(PyObject *obj,
^~~~~~~~~~~~~~~~~~~~~
error: command 'gcc' failed with exit status 1
----------------------------------------
ERROR: Failed building wheel for gdal
Running setup.py clean for gdal
...
...
error: command 'gcc' failed with exit status 1
----------------------------------------
ERROR: Command "/usr/local/bin/python -u -c 'import setuptools, tokenize;__file__='"'"'/tmp/pip-install-hlldvrpz/gdal/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-fapa0jlw/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-install-hlldvrpz/gdal/
It seems there's a GDAL version conflict:
pip install gdal pulls the source tarball for GDAL 3.0.0 and attempts to build it from source;
The 2.4.0-r1 gdal-dev package is being installed, which is indeed the latest available in Alpine repositories;
GDAL 3.0.0 won't build against the 2.4.0-r1 gdal-dev headers.
As a workaround, installing the GDAL module of the matching version, 2.4.0, may be successful:
pip install 'gdal==2.4.0'
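More generally, you can pin the Python bindings to whatever GDAL library is actually installed; a small sketch, assuming gdal-config is on PATH (it should come with the gdal-dev package):
gdal-config --version                          # e.g. 2.4.0
pip install "gdal==$(gdal-config --version)"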
You don't have a full python development environment.
apk add --no-cache gcc g++ python python-dev py-pip mysql-dev linux-headers libffi-dev openssl-dev

OpenCV cmake can't find PNG and JPEG

I downloaded OpenCV 3.2 and ran:
cmake -D CMAKE_BUILD_TYPE=RELEASE \
-D CMAKE_INSTALL_PREFIX=$(python -c "import sys; print(sys.prefix)") \
-D PYTHON_EXECUTABLE=$(which python) \
-D OPENCV_EXTRA_MODULES_PATH=/home/alex/Software/opencv_contrib/modules \
-D WITH_OPENGL=ON \
-D WITH_JPPEG=ON \
-D WITH_PNG=ON \
-D WITH_CUDA=ON \
-D ENABLE_FAST_MATH=1 \
-D WITH_CUBLAS=1 ..
under Ubuntu 16.04 and got:
- Media I/O:
....
-- JPEG: NO
-- WEBP: build (ver 0.3.1)
-- PNG: NO
........
As you can see, both JPEG and PNG show 'NO', and I have libjpeg-dev, libgtk2.0-dev and libpng-dev installed. So how can I make OpenCV support JPEG and PNG?
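Since stale CMake cache entries caused a similar mismatch in the first question above, one thing worth trying (a hedged suggestion, not a confirmed fix) is clearing the cache, re-running the full cmake command, and then checking what was actually detected:
rm -f CMakeCache.txt
# re-run the full cmake command from above, then inspect the detected libraries:
grep -iE 'jpeg|png' CMakeCache.txt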

Compiling OpenCV with GStreamer, cmake not finding GStreamer

I want to build OpenCV with GStreamer support.
I built the GStreamer from source (version 1.8.1) following this guide:
http://kacianka.at/?p=145
I have a 'gstreamer_build' folder in my home directory, and it contains a 'bin' folder with these:
gst-device-monitor-1.0
gst-discoverer-1.0
gst-inspect-1.0
gst-launch-1.0
gst-play-1.0
gst-stats-1.0
gst-typefind-1.0
orc-bugreport
orcc
I have this path added to my environment variable PATH.
When I use cmake like:
cmake -D CMAKE_BUILD_TYPE=RELEASE \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D OPENCV_EXTRA_MODULES_PATH=~/opencv_contrib/modules \
-D BUILD_opencv_python3=ON \
-D WITH_GSTREAMER=ON \
-D WITH_FFMPEG=OFF ..
I get the following output clearly indicating that gstreamer is not found:
-- checking for module 'gstreamer-base-1.0'
-- package 'gstreamer-base-1.0' not found
-- checking for module 'gstreamer-video-1.0'
-- package 'gstreamer-video-1.0' not found
-- checking for module 'gstreamer-app-1.0'
-- package 'gstreamer-app-1.0' not found
-- checking for module 'gstreamer-riff-1.0'
-- package 'gstreamer-riff-1.0' not found
-- checking for module 'gstreamer-pbutils-1.0'
-- package 'gstreamer-pbutils-1.0' not found
-- checking for module 'gstreamer-base-0.10'
-- package 'gstreamer-base-0.10' not found
-- checking for module 'gstreamer-video-0.10'
-- package 'gstreamer-video-0.10' not found
-- checking for module 'gstreamer-app-0.10'
-- package 'gstreamer-app-0.10' not found
-- checking for module 'gstreamer-riff-0.10'
-- package 'gstreamer-riff-0.10' not found
-- checking for module 'gstreamer-pbutils-0.10'
-- package 'gstreamer-pbutils-0.10' not found
and this:
Video I/O:
-- DC1394 1.x: NO
-- DC1394 2.x: NO
-- FFMPEG: NO
-- codec: NO
-- format: NO
-- util: NO
-- swscale: NO
-- resample: NO
-- gentoo-style: NO
-- GStreamer: NO
-- OpenNI: NO
-- OpenNI PrimeSensor Modules: NO
-- OpenNI2: NO
-- PvAPI: NO
-- GigEVisionSDK: NO
-- UniCap: NO
-- UniCap ucil: NO
-- V4L/V4L2: Using libv4l1 (ver 1.0.1) / libv4l2 (ver 1.0.1)
-- XIMEA: NO
-- Xine: NO
-- gPhoto2: NO
Can anyone help me with this?
I had the same problem.
gstreamer-base corresponds to libgstbase-1.0.so (or libgstbase-0.10.so), found in package libgstreamer1.0-0 (or libgstreamer0.10-0, as the case may be). Below, we install the '-dev' package.
The other libraries (libgst-video, libgst-app, libgst-riff, libgst-pbutils) I found in package libgstreamer-plugins-base1.0-dev (again, substitute the version you wish to use, either 0.10 or 1.0).
Therefore, the following command should be used to install the missing dependencies:
sudo apt install libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev
Repeat the cmake command, possibly purging the contents of the build directory beforehand.
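Once the -dev packages are installed, you can confirm that pkg-config (which is what the OpenCV cmake scripts query) now resolves the modules it was complaining about:
pkg-config --modversion gstreamer-base-1.0
pkg-config --modversion gstreamer-video-1.0
pkg-config --modversion gstreamer-app-1.0
pkg-config --modversion gstreamer-riff-1.0
pkg-config --modversion gstreamer-pbutils-1.0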
The below worked for me if you are developing just a GStreamer application:
# GStreamer CMake building
cmake_minimum_required(VERSION 3.3)
project(GStreamerHello)
set(PKG_CONFIG_USE_CMAKE_PREFIX_PATH ON)
find_package(PkgConfig REQUIRED)
if (NOT PKG_CONFIG_FOUND)
message(FATAL_ERROR "Please install pkg-config: CMake will exit")
endif()
pkg_check_modules(GST REQUIRED gstreamer-1.0>=1.8)
if ( NOT (GST_FOUND))
message(FATAL_ERROR "Please Install Gstreamer Dev: CMake will Exit")
endif()
set(ENV{PKG_CONFIG_PATH})
include_directories("${GST_INCLUDE_DIRS}")
link_libraries(${GST_LIBRARIES})
add_executable(gstreamerSrvc src/hello_gstreamer.cc)
# add_dependencies(gstreamerSrvc vsphere_header)  # target from the author's own project; not needed for this standalone example
target_link_libraries(gstreamerSrvc ${GST_LIBRARIES} )
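A minimal way to configure and build the example project above, assuming the CMakeLists.txt sits next to a src/hello_gstreamer.cc and the GStreamer dev packages are installed:
mkdir -p build && cd build
cmake ..
make
./gstreamerSrvc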
Note: if you need a dev Docker image for GStreamer, it is below; it also covers the parts of your question about compiling with OpenCV.
More details at https://medium.com/techlogs/compiling-opencv-for-cuda-for-yolo-and-other-cnn-libraries-9ce427c00ff8
FROM nvidia/cuda
# This is a dev image, needed to compile OpenCV with CUDA
# Install Gstreamer and OpenCV Pre-requisite libs
RUN apt-get update -y && apt-get install -y \
libgstreamer1.0-0 \
gstreamer1.0-plugins-base \
gstreamer1.0-plugins-good \
gstreamer1.0-plugins-bad \
gstreamer1.0-plugins-ugly \
gstreamer1.0-libav \
gstreamer1.0-doc \
gstreamer1.0-tools \
libgstreamer1.0-dev \
libgstreamer-plugins-base1.0-dev
RUN apt-get update -y && apt-get install -y pkg-config \
zlib1g-dev libwebp-dev \
libtbb2 libtbb-dev \
libgtk2.0-dev pkg-config libavcodec-dev libavformat-dev libswscale-dev \
cmake
RUN apt-get install -y \
autoconf \
autotools-dev \
build-essential \
gcc \
git
ENV OPENCV_RELEASE_TAG 3.4.5
RUN git clone https://github.com/opencv/opencv.git /var/local/git/opencv
RUN cd /var/local/git/opencv && \
git checkout tags/${OPENCV_RELEASE_TAG}
RUN mkdir -p /var/local/git/opencv/build && \
cd /var/local/git/opencv/build && \
cmake -D CMAKE_BUILD_TYPE=Release -D BUILD_PNG=OFF -D \
BUILD_TIFF=OFF -D BUILD_TBB=OFF -D BUILD_JPEG=ON \
-D BUILD_JASPER=OFF -D BUILD_ZLIB=ON -D BUILD_EXAMPLES=OFF \
-D BUILD_opencv_java=OFF -D BUILD_opencv_python2=ON \
-D BUILD_opencv_python3=OFF -D ENABLE_NEON=OFF -D WITH_OPENCL=OFF \
-D WITH_OPENMP=OFF -D WITH_FFMPEG=OFF -D WITH_GSTREAMER=ON -D WITH_GSTREAMER_0_10=OFF \
-D WITH_CUDA=ON -D CUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda/ -D WITH_GTK=ON \
-D WITH_VTK=OFF -D WITH_TBB=ON -D WITH_1394=OFF -D WITH_OPENEXR=OFF \
-D CUDA_ARCH_BIN="6.0 6.1 7.0" -D CUDA_ARCH_PTX="" -D INSTALL_C_EXAMPLES=OFF -D INSTALL_TESTS=OFF ..
RUN cd /var/local/git/opencv/build && \
make install
# Install other tools you need for development
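Building and entering the image above, as a usage sketch (the tag name is just an example):
docker build -t opencv-gstreamer-cuda-dev .
docker run --rm -it opencv-gstreamer-cuda-dev bash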
On Windows there is no "sudo apt install ...". I also had all the paths set correctly in my PATH environment variable, and still had the same problem. I got this working after setting the following CMake options:
set only the "WITH_GSTREAMER" option to True; "WITH_GSTREAMER_0_10" MUST BE FALSE
add new entry "GSTREAMER_DIR"=(path to gstreamer)
for me it was "C:/gstreamer/1.0/x86_64"
I found this solution here
My OpenCV version: 3.4.3
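If you prefer the command line over the CMake GUI on Windows, the same settings can be passed directly; a sketch using the path quoted above (adjust it to your own GStreamer install):
cmake -D WITH_GSTREAMER=ON -D WITH_GSTREAMER_0_10=OFF -D GSTREAMER_DIR="C:/gstreamer/1.0/x86_64" ..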
Adding a new entry "GSTREAMER_DIR"=(path to gstreamer) worked for me (with WITH_GSTREAMER set to true). My version did not have a WITH_GSTREAMER_0_10 option.
