I am unable to open a network camera stream on macOS Catalina with Python 3.7 and OpenCV 4.1.2.
I am running the IP Webcam app on a phone, which exposes a video endpoint at http://192.168.87.26:8080/video. The following code fails:
import cv2
cap = cv2.VideoCapture('http://192.168.87.26:8080/video')
and the error message is:
OpenCV: Couldn't read video stream from file "http://192.168.87.26:8080/video"
At the same time, opening a local MP4 file works fine, and I have granted camera permissions in macOS so the built-in webcam works as well.
I have tried both the pip-installed opencv-python package and an OpenCV built from source, but the video-stream error does not go away.
FFmpeg is installed on the system, and ffplay plays the URL http://192.168.87.26:8080/video without problems.
$ brew info ffmpeg
ffmpeg: stable 4.2.1 (bottled), HEAD
Play, record, convert, and stream audio and video
https://ffmpeg.org/
/usr/local/Cellar/ffmpeg/4.2.1_2 (287 files, 56.6MB) *
What am I missing?
After some more digging, I was able to make it work. It turns out an old Intel OpenVINO install left over from my Mojave days was interfering with every local version of OpenCV I installed.
During the exercise, I also found that building OpenCV from source works far better for me than installing it from pip.
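For completeness, here is a minimal sketch of the kind of sanity check that helps here; the explicit cv2.CAP_FFMPEG backend request is my own assumption, not something the original code used:

import cv2

# Request the FFmpeg backend explicitly so a missing or shadowed FFmpeg shows
# up as a failed open rather than a silent fallback to another backend.
cap = cv2.VideoCapture('http://192.168.87.26:8080/video', cv2.CAP_FFMPEG)
if not cap.isOpened():
    raise RuntimeError('could not open the network stream')

while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow('ip-webcam', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()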
Related
I'm using hardware encoding via FFmpeg and OpenMAX.
If I use FFmpeg's h264_omx as the backend for VideoWriter, then I get an image like this, with the colors obviously mixed up.
Other codecs work fine, and if I re-encode the same material via the FFmpeg command line, the output is okay.
Is there a workaround, and where should I look?
Machine: Raspberry Pi 4B+
System: Custom Yocto Distribution (master branches)
OpenCV version: 4.1.0
FFmpeg version: 4.2.2 (built with --enable-omx and --enable-omx-rpi)
I have the same issue with encoding h264 through rtmp...
I believe this is a pixel padding/alignment issue, but I do not know where it arises.
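A minimal sketch of the VideoWriter setup being discussed, assuming the FFmpeg backend selects the H.264 encoder on its own; the 640x480 frame size, the avc1 FourCC, and the output filename are illustrative assumptions, chosen so that both dimensions are multiples of 32 and the suspected stride/padding case is avoided:

import cv2
import numpy as np

# Illustrative parameters: both dimensions are multiples of 32, which avoids
# the stride/padding mismatch suspected above for the OMX encoder.
width, height, fps = 640, 480, 30

fourcc = cv2.VideoWriter_fourcc(*'avc1')  # H.264; the FFmpeg build picks the encoder
writer = cv2.VideoWriter('out.mp4', fourcc, fps, (width, height))
if not writer.isOpened():
    raise RuntimeError('VideoWriter failed to open -- check the FFmpeg build flags')

for i in range(90):
    # OpenCV passes BGR frames to the backend; if the encoder reads the planes
    # or the stride differently, exactly this kind of colour mix-up appears.
    frame = np.full((height, width, 3), (i % 256, 128, 64), dtype=np.uint8)
    writer.write(frame)

writer.release()

If the artefact disappears at aligned resolutions but reappears at an unaligned one, that would support the padding hypothesis.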
I have installed OpenCV on Ubuntu 14. When I try to feed a video into one of its functions, it tells me to install the FFmpeg libraries, but as far as I know FFmpeg was replaced by another decoder library (libav) on Ubuntu 14. How can I fix this, or use another library?
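Whichever route you take, it is worth first checking which video backends the installed OpenCV was actually compiled against; a small sketch that filters the build report for the relevant lines:

import cv2

# Keep only the build-report lines that mention video backends; a "NO" next
# to FFMPEG means this build cannot decode most common video containers.
for line in cv2.getBuildInformation().splitlines():
    if any(key in line for key in ('FFMPEG', 'GStreamer', 'v4l', 'V4L')):
        print(line.strip())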
Firstly, I've installed ffmpeg using
sudo port install ffmpeg
on my MacBook with OS X 10.9 and Xcode 5.1. I've done the same for OpenCV
sudo port install opencv
and I got face detection working using this SO answer. However, when I try to open a video file in the source code folder using VideoCapture, I get the error "WARNING: Couldn't read movie file Alireza_Day1_001.avi". Has anyone faced the same issue? (FYI: VideoCapture from my webcam works fine, but I have tried opening both a .mov and an .avi file without luck.) Any help is much appreciated!
I have started working on a Raspberry Pi. I have used OpenCV on Windows and Ubuntu, and now I want to do image processing on the Pi. I have installed the latest version, 2.4.8, and I am able to open and display an image. However, when I try to open the webcam and display its feed, I get the error: HIGHGUI ERROR: v4l/v4l2: VIDIOC_CROPCAP.
Can anyone tell me what the problem is?
I had the same error when trying to run some face-detection code; I solved it by adding the required files to my working folder (the source folder).
If you show us the code, we can figure out exactly what you need.
If you are using prebuilt binaries, they may have been built without v4l support. Try building your own OpenCV with v4l enabled, and of course install v4l with its development files first. You can follow this tutorial, which explains how to install the required dependencies.
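As a quick check before rebuilding, the sketch below tries to open the camera while explicitly requesting the V4L2 backend; note that this two-argument form of VideoCapture only exists in OpenCV 3.4+/4.x, not in the 2.4.8 build mentioned above.

import cv2

# Explicitly request V4L2; if the build lacks v4l support this fails to open
# instead of silently falling back to another backend.
cap = cv2.VideoCapture(0, cv2.CAP_V4L2)
if not cap.isOpened():
    print('V4L2 backend not available in this build')
else:
    ok, _ = cap.read()
    print('frame grabbed:', ok)
    cap.release()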
I have a problem getting a webcam to work with OpenCV 2.3 under Cygwin. I am on a 64-bit Windows 7 system and use Notepad++ and the Cygwin compilers for my C++ programming. I have seen other posts with similar problems:
Getting Webcam to work in OpenCV
Can't access webcam with OpenCV
I first tried installing OpenCV via Cygwin Ports. That was easy to install, but I ran into the webcam problem (it always returns false when trying to find a device). I have also attempted to build and install OpenCV manually using the command-line flavor of CMake, adding the HAVE_VIDEOINPUT and HAVE_DSHOW flags, but no dice. All my programs compile fine, and I have all of OpenCV's functionality aside from this webcam issue.
Has anyone successfully built OpenCV 2.3 on Cygwin with a working webcam?
It is unlikely that a Cygwin build of OpenCV will be able to access a webcam, at least not without hacking the OpenCV CMake scripts. Under Cygwin, the OpenCV build always follows the UNIX branch, so videoinput/DirectShow is excluded from the build.