Installing hdf5-openmpi-devel in Fedora 23

I am trying to install hdf5-openmpi-devel in Fedora 23. When I run
sudo dnf install hdf5-openmpi-devel
I get
Last metadata expiration check: 1:12:51 ago on Mon Oct 30 22:53:47 2017.
Package hdf5-openmpi-devel-1.8.15-10.patch1.fc23.x86_64 is already installed, skipping.
Dependencies resolved.
Nothing to do.
Complete!
But when I check with ldconfig, I don't see any openmpi files:
rg@supersg: ldconfig -p | grep hdf5
libhdf5hl_fortran.so.10 (libc6,x86-64) => /lib64/libhdf5hl_fortran.so.10
libhdf5_hl_cpp.so.10 (libc6,x86-64) => /lib64/libhdf5_hl_cpp.so.10
libhdf5_hl.so.10 (libc6,x86-64) => /lib64/libhdf5_hl.so.10
libhdf5_fortran.so.10 (libc6,x86-64) => /lib64/libhdf5_fortran.so.10
libhdf5_cpp.so.10 (libc6,x86-64) => /lib64/libhdf5_cpp.so.10
libhdf5.so.10 (libc6,x86-64) => /lib64/libhdf5.so.10
Any ideas how to install hdf5-openmpi-devel in Fedora 23?

The libs are in /usr/lib64/openmpi/lib:
ldd /usr/lib64/openmpi/lib/libhdf5.so.8
linux-vdso.so.1 => (0x00007fffa2d3f000)
libz.so.1 => /lib64/libz.so.1 (0x00007fade3235000)
libdl.so.2 => /lib64/libdl.so.2 (0x00007fade3030000)
libm.so.6 => /lib64/libm.so.6 (0x00007fade2d2e000)
libmpi.so.12 => /usr/lib64/openmpi/lib/libmpi.so.12 (0x00007fade2a4a000)
libpthread.so.0 => /lib64/libpthread.so.0 (0x00007fade282d000)
libc.so.6 => /lib64/libc.so.6 (0x00007fade246a000)
/lib64/ld-linux-x86-64.so.2 (0x0000559a79df2000)
libopen-rte.so.12 => /usr/lib64/openmpi/lib/libopen-rte.so.12 (0x00007fade21ee000)
libopen-pal.so.13 => /usr/lib64/openmpi/lib/libopen-pal.so.13 (0x00007fade1f49000)
librt.so.1 => /lib64/librt.so.1 (0x00007fade1d41000)
libutil.so.1 => /lib64/libutil.so.1 (0x00007fade1b3e000)
libhwloc.so.5 => /lib64/libhwloc.so.5 (0x00007fade1900000)
libnuma.so.1 => /lib64/libnuma.so.1 (0x00007fade16f4000)
libpciaccess.so.0 => /lib64/libpciaccess.so.0 (0x00007fade14e9000)
libxml2.so.2 => /lib64/libxml2.so.2 (0x00007fade117f000)
libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007fade0f69000)
liblzma.so.5 => /lib64/liblzma.so.5 (0x00007fade0d42000)
They should be picked up automatically if you use the MPI compiler wrappers (mpicc, mpifort, and friends).
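For instance, a minimal sketch of that workflow on Fedora, assuming the stock environment-modules setup; hello_hdf5.c is a hypothetical source file, and the include/lib paths follow Fedora's MPI packaging conventions:
# The libs live outside the default ldconfig path, which is why ldconfig -p doesn't list them
rpm -ql hdf5-openmpi-devel | grep '\.so'
# Load the OpenMPI environment so mpicc and friends appear on PATH
module load mpi/openmpi-x86_64
# Build against the parallel HDF5; add -I/-L explicitly if the wrapper doesn't pick them up
mpicc hello_hdf5.c -I/usr/include/openmpi-x86_64 -L/usr/lib64/openmpi/lib -lhdf5 -o hello_hdf5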

Debug in Docker: Connection was not established. Probably 'xdebug.remote_host=host.docker.internal'

I've read countless articles, including this one, which sounded almost exactly like my setup/issue:
Xdebug inside Docker on Mac M1 2021 is not working
I'm still getting the error message mentioned in the title.
Here is how I spin up the container (which ships with PHPUnit and Xdebug):
docker run --rm -it -v $(pwd):/app -v $(pwd)/xdebug.ini:/usr/local/etc/php/conf.d/docker-php-ext-xdebug.ini jitesoft/phpunit /bin/ash
Here is the xdebug.ini I use to overwrite the defaults:
;
; NOTE: We want to echo these contents to this file location
;
; PATH: /usr/local/etc/php/conf.d/docker-php-ext-xdebug.ini
;
zend_extension=/usr/local/lib/php/extensions/no-debug-non-zts-20220829/xdebug.so
xdebug.mode=debug
xdebug.idekey=PHPSTORM
xdebug.start_with_request=yes
xdebug.discover_client_host=0
xdebug.client_host=host.docker.internal
xdebug.log=/var/log/xdebug.log
Here is the debug config dump:
[Xdebug ASCII-art banner]
Version => 3.2.0
Support Xdebug on Patreon, GitHub, or as a business: https://xdebug.org/support
Enabled Features (through 'XDEBUG_MODE' env variable)
Feature => Enabled/Disabled
Development Helpers => ✘ disabled
Coverage => ✔ enabled
GC Stats => ✘ disabled
Profiler => ✘ disabled
Step Debugger => ✘ disabled
Tracing => ✘ disabled
Optional Features
Compressed File Support => no
Clock Source => clock_gettime
'xdebug://gateway' pseudo-host support => yes
'xdebug://nameserver' pseudo-host support => no
Systemd Private Temp Directory => not enabled
Diagnostic Log
No messages
PHP
Build Configuration
Version (Run Time) => 8.2.1
Version (Compile Time) => 8.2.1
Debug Build => no
Thread Safety => disabled
Settings
Configuration File (php.ini) Path => /usr/local/etc/php
Loaded Configuration File => (none)
Scan this dir for additional .ini files => /usr/local/etc/php/conf.d
Additional .ini files parsed => /usr/local/etc/php/conf.d/date_timezone.ini,
/usr/local/etc/php/conf.d/docker-php-ext-xdebug.ini,
/usr/local/etc/php/conf.d/memory-limit.ini
Directive => Local Value => Master Value
xdebug.mode (through XDEBUG_MODE) => coverage => debug
xdebug.start_with_request => yes => yes
xdebug.start_upon_error => default => default
xdebug.output_dir => /tmp => /tmp
xdebug.use_compression => 0 => 0
xdebug.trigger_value => no value => no value
xdebug.file_link_format => no value => no value
xdebug.filename_format => no value => no value
xdebug.log => /var/log/xdebug.log => /var/log/xdebug.log
xdebug.log_level => 7 => 7
xdebug.var_display_max_children => 128 => 128
xdebug.var_display_max_data => 512 => 512
xdebug.var_display_max_depth => 3 => 3
xdebug.max_nesting_level => 256 => 256
xdebug.cli_color => 0 => 0
xdebug.force_display_errors => Off => Off
xdebug.force_error_reporting => 0 => 0
xdebug.halt_level => 0 => 0
xdebug.max_stack_frames => -1 => -1
xdebug.show_error_trace => Off => Off
xdebug.show_exception_trace => Off => Off
xdebug.show_local_vars => Off => Off
xdebug.dump.COOKIE => no value => no value
xdebug.dump.ENV => no value => no value
xdebug.dump.FILES => no value => no value
xdebug.dump.GET => no value => no value
xdebug.dump.POST => no value => no value
xdebug.dump.REQUEST => no value => no value
xdebug.dump.SERVER => no value => no value
xdebug.dump.SESSION => no value => no value
xdebug.dump_globals => On => On
xdebug.dump_once => On => On
xdebug.dump_undefined => Off => Off
xdebug.profiler_output_name => cachegrind.out.%p => cachegrind.out.%p
xdebug.profiler_append => Off => Off
xdebug.cloud_id => no value => no value
xdebug.client_host => host.docker.internal => host.docker.internal
xdebug.client_port => 9003 => 9003
xdebug.discover_client_host => Off => Off
xdebug.client_discovery_header => HTTP_X_FORWARDED_FOR,REMOTE_ADDR => HTTP_X_FORWARDED_FOR,REMOTE_ADDR
xdebug.idekey => PHPSTORM => PHPSTORM
xdebug.connect_timeout_ms => 200 => 200
xdebug.scream => Off => Off
xdebug.gc_stats_output_name => gcstats.%p => gcstats.%p
xdebug.trace_output_name => trace.%c => trace.%c
xdebug.trace_format => 0 => 0
xdebug.trace_options => 0 => 0
xdebug.collect_assignments => Off => Off
xdebug.collect_return => Off => Off
Here is the console output when I try to run index.php for debugging:
[docker://jitesoft/phpunit:latest/]:php -dxdebug.mode=debug -dxdebug.client_port=9000 -dxdebug.client_host=host.docker.internal -dxdebug.client_host=host.docker.internal /opt/project/index.php
Process finished with exit code 0
Two things jump out at me:
The top of the debug config dump suggests step debugging is still disabled???
The path to the PHP script I want to execute on the Docker side is /app/index.php, not /opt/project/index.php.
Thoughts?
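One detail in the dump above may matter: the line xdebug.mode (through XDEBUG_MODE) => coverage => debug says the container's XDEBUG_MODE environment variable (set to coverage) is overriding the xdebug.mode=debug from the ini, which would explain the step debugger showing as disabled. A hedged way to test that is to reuse the original run command and force the env var:
docker run --rm -it \
  -e XDEBUG_MODE=debug \
  -v $(pwd):/app \
  -v $(pwd)/xdebug.ini:/usr/local/etc/php/conf.d/docker-php-ext-xdebug.ini \
  jitesoft/phpunit /bin/ash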

docker with tensorflow gpu - ImportError: libcublas.so.9.0: cannot open shared object file: No such file or directory

I'm trying to run Docker with TensorFlow using NVIDIA GPUs, but when I run my container I get the following error:
pgp_1 | Traceback (most recent call last):
pgp_1 | File "/opt/app-root/lib/python3.6/site-packages/tensorflow/python/pywrap_tensorflow.py", line 58, in <module>
pgp_1 | from tensorflow.python.pywrap_tensorflow_internal import *
pgp_1 | File "/opt/app-root/lib/python3.6/site-packages/tensorflow/python/pywrap_tensorflow_internal.py", line 28, in <module>
pgp_1 | _pywrap_tensorflow_internal = swig_import_helper()
pgp_1 | File "/opt/app-root/lib/python3.6/site-packages/tensorflow/python/pywrap_tensorflow_internal.py", line 24, in swig_import_helper
pgp_1 | _mod = imp.load_module('_pywrap_tensorflow_internal', fp, pathname, description)
pgp_1 | File "/opt/app-root/lib64/python3.6/imp.py", line 243, in load_module
pgp_1 | return load_dynamic(name, filename, file)
pgp_1 | File "/opt/app-root/lib64/python3.6/imp.py", line 343, in load_dynamic
pgp_1 | return _load(spec)
pgp_1 | ImportError: libcublas.so.9.0: cannot open shared object file: No such file or directory
Docker-compose
My docker-compose file looks like:
version: '3'
services:
  pgp:
    devices:
      - /dev/nvidia0
      - /dev/nvidia1
      - /dev/nvidia2
      - /dev/nvidia3
      - /dev/nvidia4
      - /dev/nvidiactl
      - /dev/nvidia-uvm
    image: "myimg/pgp"
    ports:
      - "5000:5000"
    environment:
      - LD_LIBRARY_PATH=/opt/local/cuda/lib64/
      - GPU_DEVICE=4
      - NVIDIA_VISIBLE_DEVICES=all
      - NVIDIA_DRIVER_CAPABILITIES=compute,utility
    volumes:
      - ./train_package:/opt/app-root/src/train_package
      - /usr/local/cuda/lib64/:/opt/local/cuda/lib64/
As you can see, I tried mapping the host CUDA libraries into the container with a volume, but this didn't help.
Running nvidia-docker run --rm nvidia/cuda nvidia-smi works successfully.
Versions
Cuda
cat /usr/local/cuda/version.txt shows CUDA Version 9.0.176
nvcc -V
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2017 NVIDIA Corporation
Built on Fri_Sep__1_21:08:03_CDT_2017
Cuda compilation tools, release 9.0, V9.0.176
nvidia-docker version
NVIDIA Docker: 2.0.3
Client:
Version: 17.12.1-ce
API version: 1.35
Go version: go1.9.4
Git commit: 7390fc6
Built: Tue Feb 27 22:17:40 2018
OS/Arch: linux/amd64
Server:
Engine:
Version: 17.12.1-ce
API version: 1.35 (minimum version 1.12)
Go version: go1.9.4
Git commit: 7390fc6
Built: Tue Feb 27 22:16:13 2018
OS/Arch: linux/amd64
Experimental: false
Tensorflow
1.5 with gpu support, via pip
ldconfig -p | grep cuda
libnvrtc.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvrtc.so.9.0
libnvrtc.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvrtc.so
libnvrtc-builtins.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvrtc-builtins.so.9.0
libnvrtc-builtins.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvrtc-builtins.so
libnvgraph.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvgraph.so.9.0
libnvgraph.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvgraph.so
libnvblas.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvblas.so.9.0
libnvblas.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvblas.so
libnvToolsExt.so.1 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvToolsExt.so.1
libnvToolsExt.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvToolsExt.so
libnpps.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnpps.so.9.0
libnpps.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnpps.so
libnppitc.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppitc.so.9.0
libnppitc.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppitc.so
libnppisu.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppisu.so.9.0
libnppisu.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppisu.so
libnppist.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppist.so.9.0
libnppist.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppist.so
libnppim.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppim.so.9.0
libnppim.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppim.so
libnppig.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppig.so.9.0
libnppig.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppig.so
libnppif.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppif.so.9.0
libnppif.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppif.so
libnppidei.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppidei.so.9.0
libnppidei.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppidei.so
libnppicom.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppicom.so.9.0
libnppicom.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppicom.so
libnppicc.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppicc.so.9.0
libnppicc.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppicc.so
libnppial.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppial.so.9.0
libnppial.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppial.so
libnppc.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppc.so.9.0
libnppc.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnppc.so
libicudata.so.55 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libicudata.so.55
libcusparse.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libcusparse.so.9.0
libcusparse.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libcusparse.so
libcusolver.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libcusolver.so.9.0
libcusolver.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libcusolver.so
libcurand.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libcurand.so.9.0
libcurand.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libcurand.so
libcuinj64.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libcuinj64.so.9.0
libcuinj64.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libcuinj64.so
libcufftw.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libcufftw.so.9.0
libcufftw.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libcufftw.so
libcufft.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libcufft.so.9.0
libcufft.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libcufft.so
libcudart.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libcudart.so.9.0
libcudart.so.7.5 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libcudart.so.7.5
libcudart.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libcudart.so
libcudart.so (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libcudart.so
libcuda.so.1 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libcuda.so.1
libcuda.so (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libcuda.so
libcublas.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libcublas.so.9.0
libcublas.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libcublas.so
libaccinj64.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libaccinj64.so.9.0
libaccinj64.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libaccinj64.so
libOpenCL.so.1 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libOpenCL.so.1
libOpenCL.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libOpenCL.so
Tests with Tensorflow on Docker vs host
The following works, when running on the host:
python3 -c "import tensorflow as tf; print(tf.GIT_VERSION, tf.VERSION)"
v1.5.0-0-g37aa430d84 1.5.0
Run container
nvidia-docker run -d --name testtfgpu -p 8888:8888 -p 6006:6006 gcr.io/tensorflow/tensorflow:latest-gpu
Log in
nvidia-docker exec -it testtfgpu bash
Test Tensorflow version
pip show tensorflow-gpu shows:
pip show tensorflow-gpu
Name: tensorflow-gpu
Version: 1.6.0
Summary: TensorFlow helps the tensors flow
Home-page: https://www.tensorflow.org/
Author: Google Inc.
Author-email: opensource@google.com
License: Apache 2.0
Location: /usr/local/lib/python2.7/dist-packages
Requires: astor, protobuf, gast, tensorboard, six, wheel, absl-py, backports.weakref, termcolor, enum34, numpy, grpcio, mock
Python 2
python -c "import tensorflow as tf; print(tf.GIT_VERSION, tf.VERSION)"
Results in:
Illegal instruction (core dumped)
Python 3
python3 -c "import tensorflow as tf; print(tf.GIT_VERSION, tf.VERSION)"
Results in:
python3 -c "import tensorflow as tf; print(tf.GIT_
Traceback (most recent call last):
File "<string>", line 1, in <module>
ImportError: No module named 'tensorflow'
The problem is your cuDNN version. TensorFlow-GPU 1.5 supports cuDNN 7.0.x, which you can download from here. Make sure that your CUDA version is 9.0.x and your cuDNN version is 7.0.x. Please refer to the link here for more details.
It looks like a conflict between the CUDA version and the TensorFlow version.
First, check your CUDA version with nvcc --version or cat /usr/local/cuda/version.txt.
If it's 8.x, you may need to reinstall CUDA or, more simply, downgrade TensorFlow to 1.4. If your CUDA is 9.x, you need TensorFlow 1.5 or newer.
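As a hedged way to verify the versions inside the container (the cudnn.h path is a common default, not a guarantee):
# CUDA toolkit version the container sees
cat /usr/local/cuda/version.txt
# cuDNN version, if the development header is present
grep -A2 '#define CUDNN_MAJOR' /usr/include/cudnn.h
# TensorFlow build that is actually importable
python3 -c "import tensorflow as tf; print(tf.GIT_VERSION, tf.VERSION)"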
Hope that helps.

NVidia driver libraries in nvidia/cuda image

I want to run ffmpeg with CUVID hardware-accelerated decoding in a container based on the official nvidia/cuda image. ffmpeg is not able to find libnvcuvid.so, although all the required CUDA libs are there.
The output of ldconfig -p | grep libnv from the container:
libnvrtc.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvrtc.so
libnvrtc-builtins.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvrtc-builtins.so
libnvidia-ptxjitcompiler.so.1 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libnvidia-ptxjitcompiler.so.1
libnvidia-opencl.so.1 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libnvidia-opencl.so.1
libnvidia-ml.so.1 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libnvidia-ml.so.1
libnvidia-fatbinaryloader.so.390.12 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libnvidia-fatbinaryloader.so.390.12
libnvidia-compiler.so.390.12 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libnvidia-compiler.so.390.12
libnvidia-cfg.so.1 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libnvidia-cfg.so.1
libnvgraph.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvgraph.so
libnvblas.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvblas.so
libnvToolsExt.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvToolsExt.so
Should I just copy libnvcuvid.so from the host? Wouldn't it break if the underlying driver version changes?
I found the answer here. You just need to pass the driver capabilities as an environment variable, either in the Dockerfile (ENV NVIDIA_DRIVER_CAPABILITIES video,compute,utility) or on the command line (-e NVIDIA_DRIVER_CAPABILITIES=compute,utility,video). Now I have all the required libs in ldconfig -p | grep libnv:
libnvrtc.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvrtc.so.9.0
libnvrtc.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvrtc.so
libnvrtc-builtins.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvrtc-builtins.so.9.0
libnvrtc-builtins.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvrtc-builtins.so
libnvidia-ptxjitcompiler.so.1 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libnvidia-ptxjitcompiler.so.1
libnvidia-opencl.so.1 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libnvidia-opencl.so.1
libnvidia-ml.so.1 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libnvidia-ml.so.1
libnvidia-fatbinaryloader.so.390.30 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libnvidia-fatbinaryloader.so.390.30
libnvidia-encode.so.1 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libnvidia-encode.so.1
libnvidia-compiler.so.390.30 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libnvidia-compiler.so.390.30
libnvidia-cfg.so.1 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libnvidia-cfg.so.1
libnvgraph.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvgraph.so.9.0
libnvgraph.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvgraph.so
libnvcuvid.so.1 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libnvcuvid.so.1
libnvblas.so.9.0 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvblas.so.9.0
libnvblas.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvblas.so
libnvToolsExt.so.1 (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvToolsExt.so.1
libnvToolsExt.so (libc6,x86-64) => /usr/local/cuda-9.0/targets/x86_64-linux/lib/libnvToolsExt.so
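For reference, a minimal one-off run along the same lines (the image tag is arbitrary here; the key part is including video in the capability list):
nvidia-docker run --rm \
  -e NVIDIA_DRIVER_CAPABILITIES=compute,utility,video \
  nvidia/cuda sh -c 'ldconfig -p | grep libnvcuvid'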

Avoid linking both OpenCV2.4 and OpenCV3.0 with cmake

I have two OpenCV installations in separate locations: version 2.4, installed with apt-get, and version 3.0, installed from source.
I noticed that my catkin_make (cmake) links both OpenCV 2.4 and OpenCV 3.0:
[ri-desktop2 robotic_vision]$ ldd devel/lib/visensor_dgem/dgem
linux-vdso.so.1 => (0x00007ffd30f5e000)
libcv_bridge.so => /opt/ros/jade/lib/libcv_bridge.so (0x00007f7eb7fa8000)
libopencv_highgui.so.2.4 => /usr/lib/x86_64-linux-gnu/libopencv_highgui.so.2.4 (0x00007f7eb7d5d000)
libopencv_core.so.2.4 => /usr/lib/x86_64-linux-gnu/libopencv_core.so.2.4 (0x00007f7eb7926000)
libroscpp.so => /opt/ros/jade/lib/libroscpp.so (0x00007f7eb75d3000)
libroscpp_serialization.so => /opt/ros/jade/lib/libroscpp_serialization.so (0x00007f7eb73d0000)
librosconsole.so => /opt/ros/jade/lib/librosconsole.so (0x00007f7eb71a7000)
librostime.so => /opt/ros/jade/lib/librostime.so (0x00007f7eb6f7d000)
libboost_system.so.1.54.0 => /usr/lib/x86_64-linux-gnu/libboost_system.so.1.54.0 (0x00007f7eb6d79000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f7eb6b5b000)
libopencv_core.so.3.0 => /usr/local/lib/libopencv_core.so.3.0 (0x00007f7eb5b1b000)
libopencv_highgui.so.3.0 => /usr/local/lib/libopencv_highgui.so.3.0 (0x00007f7eb58dd000)
libopencv_imgproc.so.3.0 => /usr/local/lib/libopencv_imgproc.so.3.0 (0x00007f7eb4941000)
libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007f7eb463d000)
libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f7eb4427000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f7eb4062000)
libopencv_imgproc.so.2.4 => /usr/lib/x86_64-linux-gnu/libopencv_imgproc.so.2.4 (0x00007f7eb3bd2000)
libGL.so.1 => /usr/lib/nvidia-352/libGL.so.1 (0x00007f7eb38a2000)
libjpeg.so.8 => /usr/lib/x86_64-linux-gnu/libjpeg.so.8 (0x00007f7eb364d000)
libpng12.so.0 => /lib/x86_64-linux-gnu/libpng12.so.0 (0x00007f7eb3427000)
libtiff.so.5 => /usr/lib/x86_64-linux-gnu/libtiff.so.5 (0x00007f7eb31b5000)
libjasper.so.1 => /usr/lib/x86_64-linux-gnu/libjasper.so.1 (0x00007f7eb2f5e000)
libIlmImf.so.6 => /usr/lib/x86_64-linux-gnu/libIlmImf.so.6 (0x00007f7eb2caf000)
libHalf.so.6 => /usr/lib/x86_64-linux-gnu/libHalf.so.6 (0x00007f7eb2a6c000)
libgtk-x11-2.0.so.0 => /usr/lib/x86_64-linux-gnu/libgtk-x11-2.0.so.0 (0x00007f7eb242f000)
libgdk-x11-2.0.so.0 => /usr/lib/x86_64-linux-gnu/libgdk-x11-2.0.so.0 (0x00007f7eb217c000)
libgobject-2.0.so.0 => /usr/lib/x86_64-linux-gnu/libgobject-2.0.so.0 (0x00007f7eb1f2b000)
libglib-2.0.so.0 => /lib/x86_64-linux-gnu/libglib-2.0.so.0 (0x00007f7eb1c23000)
libgtkglext-x11-1.0.so.0 => /usr/lib/libgtkglext-x11-1.0.so.0 (0x00007f7eb1a1f000)
libgdkglext-x11-1.0.so.0 => /usr/lib/libgdkglext-x11-1.0.so.0 (0x00007f7eb17bb000)
libdc1394.so.22 => /usr/lib/x86_64-linux-gnu/libdc1394.so.22 (0x00007f7eb1547000)
libv4l1.so.0 => /usr/lib/x86_64-linux-gnu/libv4l1.so.0 (0x00007f7eb1341000)
libavcodec.so.54 => /usr/lib/x86_64-linux-gnu/libavcodec.so.54 (0x00007f7eb05ed000)
libavformat.so.54 => /usr/lib/x86_64-linux-gnu/libavformat.so.54 (0x00007f7eb02cb000)
libavutil.so.52 => /usr/lib/x86_64-linux-gnu/libavutil.so.52 (0x00007f7eb00a6000)
libswscale.so.2 => /usr/lib/x86_64-linux-gnu/libswscale.so.2 (0x00007f7eafe5f000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f7eafb59000)
libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007f7eaf940000)
librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007f7eaf738000)
libtbb.so.2 => /usr/lib/libtbb.so.2 (0x00007f7eaf504000)
libxmlrpcpp.so => /opt/ros/jade/lib/libxmlrpcpp.so (0x00007f7eaf2e6000)
libcpp_common.so => /opt/ros/jade/lib/libcpp_common.so (0x00007f7eaf0de000)
libboost_thread.so.1.54.0 => /usr/lib/x86_64-linux-gnu/libboost_thread.so.1.54.0 (0x00007f7eaeec8000)
libboost_filesystem.so.1.54.0 => /usr/lib/x86_64-linux-gnu/libboost_filesystem.so.1.54.0 (0x00007f7eaecb2000)
librosconsole_log4cxx.so => /opt/ros/jade/lib/librosconsole_log4cxx.so (0x00007f7eaea9e000)
librosconsole_backend_interface.so => /opt/ros/jade/lib/librosconsole_backend_interface.so (0x00007f7eae89c000)
My CMakeLists.txt:
cmake_minimum_required(VERSION 2.8.3)
project(visensor_node)

find_package(catkin REQUIRED COMPONENTS
  roscpp
  message_generation
  geometry_msgs
  sensor_msgs
  cv_bridge
  std_msgs
  image_transport
  camera_info_manager
  dynamic_reconfigure
  cmake_modules
)

# check libvisensor version, flags not used later
find_package(libvisensor 1.1.0 REQUIRED)

add_message_files(
  DIRECTORY msg
  FILES
    visensor_imu.msg
    visensor_time_host.msg
    visensor_calibration.msg
)
add_service_files(
  FILES
  visensor_calibration_service.srv
)
generate_messages(DEPENDENCIES geometry_msgs)

include_directories(include ${catkin_INCLUDE_DIRS} ${libvisensor_INCLUDE_DIRS})

find_package(Eigen3 REQUIRED)
include_directories(${EIGEN_INCLUDE_DIR})
add_definitions(${EIGEN_DEFINITIONS})

find_package(OpenCV 3.0 REQUIRED COMPONENTS core highgui imgproc)

generate_dynamic_reconfigure_options(cfg/visensor_node.cfg)

if(NOT DEFINED CMAKE_BUILD_TYPE)
  set(CMAKE_BUILD_TYPE Release)
endif(NOT DEFINED CMAKE_BUILD_TYPE)

SET(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -march=native -Wall -std=c++0x -D__STRICT_ANSI__")

catkin_package(
  INCLUDE_DIRS include ${catkin_INCLUDE_DIRS}
  CATKIN_DEPENDS
    roscpp
    sensor_msgs
    cv_bridge
    std_msgs
    image_transport
    camera_info_manager
)

# build and add libvisensor system library dependency
add_executable(visensor_node src/visensor_node.cpp src/visensor.cpp)
add_executable(custom_node src/custom_node.cpp include/custom_node.h)
add_dependencies(visensor_node ${${PROJECT_NAME}_EXPORTED_TARGETS})
add_dependencies(custom_node ${${PROJECT_NAME}_EXPORTED_TARGETS})
target_link_libraries(visensor_node ${libvisensor_LIBRARIES} ${catkin_LIBRARIES} ${OpenCV_LIBRARIES})
target_link_libraries(custom_node ${libvisensor_LIBRARIES} ${catkin_LIBRARIES} ${OpenCV_LIBRARIES})
What could possibly be going wrong?
I recently posted this as an answer here.
Libraries can be included and linked with include_directories() and link_directories(), like this:
cmake_minimum_required(VERSION 2.4.6)
include($ENV{ROS_ROOT}/core/rosbuild/rosbuild.cmake)
...
include_directories(${PROJECT_SOURCE_DIR})
include_directories(/path/to/opencv/OpenCV-2.4.1/include)
link_directories(/path/to/opencv/OpenCV-2.4.1/lib)
...
# Set the build type. Options are:
# Coverage : w/ debug symbols, w/o op ...
and remove the standard linking to your current OpenCV libs.
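After switching the include and link paths, a quick sanity check along the lines already used above is to re-run ldd and confirm that only one OpenCV version is pulled in:
ldd devel/lib/visensor_dgem/dgem | grep opencv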

gstreamer plugin library not linking against opencv shared object library - “undefined symbol” on CentOS

I have a plugin that uses OpenCV, and it works perfectly on Kubuntu. Now I am trying to run the code on CentOS, but when I run the pipeline I get:
$ gst-launch videotestsrc ! opencvelement ! ximagesink
WARNING: erroneous pipeline: no element "opencvelement"
When I run ldd in Kubuntu, I get:
$ ldd .libs/libOPENCVELEMENT.so
linux-gate.so.1 => (0x0060f000)
libgstbase-0.10.so.0 => /usr/lib/libgstbase-0.10.so.0 (0x00a74000)
libgstreamer-0.10.so.0 => /usr/lib/libgstreamer-0.10.so.0 (0x00474000)
libgobject-2.0.so.0 => /usr/lib/i386-linux-gnu/libgobject-2.0.so.0 (0x006a2000)
libglib-2.0.so.0 => /lib/i386-linux-gnu/libglib-2.0.so.0 (0x00110000)
libopencv_core.so.2.3 => /usr/local/lib/libopencv_core.so.2.3 (0x00730000)
libstdc++.so.6 => /usr/lib/i386-linux-gnu/libstdc++.so.6 (0x00209000)
libm.so.6 => /lib/i386-linux-gnu/libm.so.6 (0x002f4000)
libc.so.6 => /lib/i386-linux-gnu/libc.so.6 (0x00acd000)
libgcc_s.so.1 => /lib/i386-linux-gnu/libgcc_s.so.1 (0x0031e000)
libpthread.so.0 => /lib/i386-linux-gnu/libpthread.so.0 (0x0033c000)
libgthread-2.0.so.0 => /usr/lib/i386-linux-gnu/libgthread-2.0.so.0 (0x00357000)
libgmodule-2.0.so.0 => /usr/lib/i386-linux-gnu/libgmodule-2.0.so.0 (0x0035d000)
libxml2.so.2 => /usr/lib/libxml2.so.2 (0x00c9c000)
librt.so.1 => /lib/i386-linux-gnu/librt.so.1 (0x00362000)
libdl.so.2 => /lib/i386-linux-gnu/libdl.so.2 (0x0036b000)
libffi.so.6 => /usr/lib/i386-linux-gnu/libffi.so.6 (0x00370000)
libpcre.so.3 => /lib/i386-linux-gnu/libpcre.so.3 (0x00e52000)
libz.so.1 => /lib/i386-linux-gnu/libz.so.1 (0x00377000)
/lib/ld-linux.so.2 (0x00a10000)
But when I run it on CentOS, I don't see OpenCV:
$ ldd .libs/libOPENCVELEMENT.so
linux-vdso.so.1 => (0x00007fff7a1fd000)
libgstvideo-0.10.so.0 => /usr/lib64/libgstvideo-0.10.so.0 (0x00002ba0ac9b7000)
libgstcontroller-0.10.so.0 => /usr/lib64/libgstcontroller-0.10.so.0 (0x00002ba0acbc4000)
libgstbase-0.10.so.0 => /usr/lib64/libgstbase-0.10.so.0 (0x00002ba0acdeb000)
libgstreamer-0.10.so.0 => /usr/lib64/libgstreamer-0.10.so.0 (0x00002ba0ad03c000)
libgobject-2.0.so.0 => /lib64/libgobject-2.0.so.0 (0x00002ba0ad326000)
libgmodule-2.0.so.0 => /lib64/libgmodule-2.0.so.0 (0x00002ba0ad569000)
libgthread-2.0.so.0 => /lib64/libgthread-2.0.so.0 (0x00002ba0ad76c000)
librt.so.1 => /lib64/librt.so.1 (0x00002ba0ad970000)
libxml2.so.2 => /usr/lib64/libxml2.so.2 (0x00002ba0adb7a000)
libz.so.1 => /lib64/libz.so.1 (0x00002ba0adeb7000)
libglib-2.0.so.0 => /lib64/libglib-2.0.so.0 (0x00002ba0ae0cb000)
libstdc++.so.6 => /usr/lib64/libstdc++.so.6 (0x00002ba0ae3a8000)
libm.so.6 => /lib64/libm.so.6 (0x00002ba0ae6a8000)
libc.so.6 => /lib64/libc.so.6 (0x00002ba0ae92b000)
libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00002ba0aec83000)
libpthread.so.0 => /lib64/libpthread.so.0 (0x00002ba0aee91000)
libdl.so.2 => /lib64/libdl.so.2 (0x00002ba0af0ac000)
/lib64/ld-linux-x86-64.so.2 (0x0000003f6c800000)
Here is my Makefile.am, which is the same on both OSes:
plugin_LTLIBRARIES = libOPENCVELEMENT.la

# Plugin source files:
libOPENCVELEMENT_la_SOURCES = opencv_chain.c opencv_chain.h datasetup.h

libOPENCVELEMENT_la_CFLAGS = \
    $(GST_PLUGINS_BASE_CFLAGS) \
    $(GST_CFLAGS) \
    $(OPENCV_CFLAGS)

libOPENCVELEMENT_la_CXXFLAGS = \
    $(GST_PLUGINS_BASE_CFLAGS) \
    $(GST_CFLAGS) \
    $(OPENCV_CFLAGS)

libOPENCVELEMENT_la_LIBADD = \
    $(GST_PLUGINS_BASE_LIBS) \
    $(GST_LIBS) \
    $(OPENCV_LIBS) \
    -lopencv_core \
    -lopencv_highgui

libOPENCVELEMENT_la_LDFLAGS = $(GST_PLUGIN_LDFLAGS)
libOPENCVELEMENT_la_LIBTOOLFLAGS = --tag=disable-static
Let me know if you need more information. Any help will be appreciated. Thank you.
Do you have all the OpenCV development files installed on CentOS as well? Run:
grep "OPENCV_" Makefile
to check what the Make variables contain. You can also pipe the linker's "undefined symbol" output through c++filt to see the real function name instead of the mangled name.
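A hedged sketch of both checks (the mangled symbol below is a made-up example, not taken from this plugin):
grep "OPENCV_" Makefile
echo '_ZN2cv3MatC1Ev' | c++filt   # prints cv::Mat::Mat()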
I was able to solve this by going to /usr/lib64/pkgconfig and modifying opencv.pc to explicitly list all the libraries. I also had to move the plugins from /usr/lib/gstreamer-0.10 to /usr/lib64/gstreamer-0.10.
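After editing opencv.pc, one quick way to confirm that pkg-config now reports the full link line:
pkg-config --libs opencv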
