IP camera with OpenCV: FFmpeg error

I am using a VIVOTEK IP camera and I am trying to interface it with OpenCV. Internet Explorer shows the video fine at this URL after entering the username and password.
The code is given below:
const std::string videoStreamAddress = "http://192.168.100.128/main.html";
// I have also tried "http://username:pasword#192.168.100.128/main.html" but got the
// same result, and also tried "http://192.168.100.128", i.e. without "main.html".
if (!vcap.open(videoStreamAddress))
{
    std::cout << "Error opening video stream or file" << std::endl;
}
I got the following error:
warning: Error opening file <../../modules/highgui/src/cap_ffmpeg_impl.hpp:529>
Error opening video stream or file
What could be the problem?

The URL which you have given is the problem. You can use a URL something like this:
"http://username:password#ipOfCamera/axis-cgi/mjpg/video.cgi?resolution=640x480&req_fps=30&.mjpg"
Another option is to download the iSpy software and use its IP camera wizard, which finds the URL for you and suggests the best choice for the camera you are using. I used this approach.
Here's the code which worked for me, as far as getting the live feed from the IP camera goes.
Here's the list of URLs which can be used to get the video from your IP camera.
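For reference, here is a minimal C++ sketch of opening such an MJPEG URL with OpenCV's VideoCapture and displaying frames; the host, credentials, and CGI path below are placeholders that vary by camera model:

#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
    // Placeholder URL: the exact CGI path and credentials depend on the camera.
    const std::string url =
        "http://username:password@192.168.100.128/video.mjpg";

    cv::VideoCapture vcap;
    if (!vcap.open(url))
    {
        std::cout << "Error opening video stream or file" << std::endl;
        return -1;
    }

    cv::Mat frame;
    while (vcap.read(frame))
    {
        cv::imshow("IP camera", frame);
        if (cv::waitKey(1) == 27)  // stop on Esc
            break;
    }
    return 0;
}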

Related

Find and share a downloaded video on Flutter iOS without going through the picker?

I have a Flutter app that can view mp4 files from a URL (using a video controller playing directly from the URL). I want the user to be able to share them if they wish. As best I can tell, the file has to actually exist on the device, so I have broken the steps down for now into: download the file, then invoke share.
I'm using this guide: https://retroportalstudio.medium.com/saving-files-to-application-folder-and-gallery-in-flutter-e9be2ebee92a
I need this to work on iOS and Android. The problem is that on iOS neither the filename I get from the dio downloader nor the one from ImageGallerySaver seems to "work" when passed to the system ShareSheet.
I'm using the Flutter packages dio, share_plus, cross_file, and image_gallery_saver, as I've seen recommended in various places.
File saveFile = File(directory.path + "/$fileName");
developer.log("starting download...");
await dio.download(url, saveFile.path,
    onReceiveProgress: (value1, value2) {
  developer.log("got progress " + value1.toString());
  setState(() {
    downloadProgress = value1 / value2;
  });
});
_permaFile = saveFile.path;
if (Platform.isIOS) {
  var galleryResult = await ImageGallerySaver.saveFile(saveFile.path,
      isReturnPathOfIOS: true);
  developer.log("gallery save result = " + galleryResult.toString());
  _permaFile = galleryResult['filePath'];
}
After getting a directory we use dio to download the file, do some log chirping, and then save the name to an object member called _permaFile.
Then the share button triggers:
void _shareAction() async {
  final box = context.findRenderObject() as RenderBox?;
  final files = <XFile>[];
  if (_permaFile == null) {
    return;
  }
  developer.log("sharing file: " + _permaFile.toString());
  files.add(XFile(_permaFile!));
  await Share.shareXFiles(files,
      text: "Event",
      // subject: "Subject for Event",
      sharePositionOrigin: box!.localToGlobal(Offset.zero) & box.size);
}
This works on an Android device: after the download I hit share, and I can share the video to a third-party app like WhatsApp.
On iOS the ShareSheet is invoked, but when I share I only get the text "Event", not the video file that should go along with it.
Note that I have tried both results: setting _permaFile to what comes back from ImageGallerySaver, and also just using what the dio downloader gives back.
Note also that ImageGallerySaver seems to work: the video really does land in the iOS video library. If I go into the Photos app I can share from there to WhatsApp and have the video get sent.
In each case I get errors like this:
[ShareSheet] error fetching item for URL:file:/var/mobile/Media/DCIM/100APPLE/IMG_0021.MP4 -- file:/// : (null)
[ShareSheet] error fetching file provider domain for URL:file:/var/mobile/Media/DCIM/100APPLE/IMG_0021.MP4 -- file:/// : (null)
[ShareSheet] error loading metadata for documentURL:file:/var/mobile/Media/DCIM/100APPLE/IMG_0021.MP4 -- file:/// error:Error Domain=NSFileProviderInternalErrorDomain Code=0 "No valid file provider found from URL file:/var/mobile/Media/DCIM/100APPLE/IMG_0021.MP4 -- file:///." UserInfo={NSLocalizedDescription=No valid file provider found from URL file:/var/mobile/Media/DCIM/100APPLE/IMG_0021.MP4 -- file:///.}
In order to test this further I built the share_plus demo app:
https://github.com/fluttercommunity/plus_plugins/tree/main/packages/share_plus/share_plus
I modified it to share videos to see what was different. The share_plus example (sp_example) works for sharing videos that have been selected by the picker.
For this reason I think the problem is something I'm missing about iOS video filenames/formats, and possibly a built-in conversion step that happens.
Here is what the filenames look like in my app:
dio download result:
file:///var/mobile/Containers/Data/Application/223BF2B9-DDF0-490E-932F-09D5F03B98B3/Library/Caches/test.mp4
ImageGallerySaver result:
file:///var/mobile/Media/DCIM/100APPLE/IMG_0019.MP4
This is what video filenames look like when they are picked and shared in sp_example:
/private/var/mobile/Containers/Data/Application/E5CB4D7C-6CDF-4AA2-8134-C4322ED7C886/tmp/trim.E6633D68-44E3-4853-A29E-A71AC95A0913.MOV
Note that it has been converted to a .MOV extension, and the user gets a trim step right in the picker, which results in "trim" in the name.
For my purposes I don't want to go through the picker; the user is already on the screen showing the video and they shouldn't have to re-pick it. So where do I get the post-conversion iOS filename that references what I just saved?

VLC is able to connect to the RTSP stream, but OpenCV and FFmpeg are not

I am trying to access a live video stream via OpenCV. First, I type the RTSP URL into VLC and I can see the video without any problem.
However, when I put the same RTSP URL into my Python code or into ffmpeg, neither is able to get any information.
The Python code is very simple, as follows, and I always get the "error" message printed on the very first read.
import cv2

vcap = cv2.VideoCapture('rtsp://XX.XX.XX.XX/mystream')
while True:
    ret, frame = vcap.read()
    if ret:
        cv2.imshow("test", frame)
        cv2.waitKey(1)
    else:
        print("error")
        break
For the ffmpeg part, I simply typed the following command to test whether there is any information I can access, but it is still not working.
ffmpeg -i rtsp://XX.XX.XX.XX/mystream
The codec I got from VLC is shown in the picture below:
Does anyone have the same problem as me, and how can I figure it out?
Thank you.

OpenCV on Ubuntu: I can't use video capture to read video files

I am trying to read video files with VideoCapture using the following code:
VideoCapture cap("input.avi");
if (!cap.isOpened())
{
    cout << "Cannot open the video" << endl;
    return -1;
}
However, the output is always "Cannot open the video", meaning that I can't read this video. (I am sure I put the video file in the correct directory.)
But I am able to capture from a real-time camera successfully using
VideoCapture cap(0);
Why does this happen? How can I fix it? I really need to read video files rather than a real-time camera.
Thanks!
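A common cause is an OpenCV build without FFmpeg support: on Ubuntu, cameras are read through V4L, while video files need FFmpeg (or GStreamer) to be decoded. A minimal C++ sketch to check this, using cv::getBuildInformation():

#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
    // Print the build configuration and look for "FFMPEG: YES".
    // Without FFmpeg (or GStreamer) support, VideoCapture can open a
    // camera device but cannot decode most video files such as .avi.
    std::cout << cv::getBuildInformation() << std::endl;

    cv::VideoCapture cap("input.avi");
    std::cout << (cap.isOpened() ? "Opened input.avi" : "Cannot open the video")
              << std::endl;
    return 0;
}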

Flex/Flash Builder/ActionScript/AIR/Mobile iOS: how to take video using the camera and/or browse for and view/access video stored in the "Camera Roll"

My understanding currently is that:
CameraUI
I can use the CameraUI to access the built in camera for MediaType.VIDEO and that delegates to the built-in video camera app and lets me record a video. My app does that now.
When I stop recording and click the "Use" button, I am returned to my app and theoretically I have a valid MediaPromise.
iOS does -not- provide a valid/usable URL/filename for the recorded video (or for photos), so I would have to use a Loader to bring in/use/access the 'recorded' video... AND... iOS does not actually create a file anywhere on the device, most importantly not in the Camera Roll, where one would expect it based on the normal behavior of the system's native camera/video app.
The documentation says that the Loader can load various image types and SWFs, but nothing about video data, so I conclude that I cannot actually use the CameraUI to generate a valid MediaPromise that I can then pass to a Loader class or similar to read in the information created by the system camera and then manipulate it (upload, save to applicationStorageDirectory, and/or display in one of the two video player components available in the API).
CameraRoll
I can have video entities in the iOS Camera Roll but the AS3/Air3.5 CameraRoll class won't let me view/access/reference them in any way.
Normal File I/O
All my attempts to use the Air3.5 File classes to browse to the storage location of the iOS Camera Roll have been rebuffed.
------- Questions -------
Am I correct in believing that there is a way to take video but no way to use the video that's been captured (no way to use the resulting MediaPromise successfully)?
I believe you can take video and access it using Android, but there's nothing in the documentation that says you cannot do so using iOS.
Am I correct in believing that iOS sandboxes apps so that they cannot browse to video/photo storage using standard File I/O, but only through the apparently non-workable means I've tried (CameraUI and CameraRoll)?
Am I wrong to think that these are rather obvious needs that one can achieve via the Xcode/Objective-C++ etc. route, but that the AIR Mobile framework does not allow, either because Apple blocks the functionality or because Adobe has failed to meet reasonable expectations?
One item of ironic note: if I use the iOS system camera app to record a video, a thumbnail of that video then appears in the Gallery/Camera Roll and, of course, I can share it, view it, or whatever. If I use AIR's CameraRoll.browseForImage(), provided I haven't used the camera to take another image, when it shows me the folder where the pictures are stored, the folder icon uses the thumbnail of the last object added, in this case the video I took; but if I then enter the folder, the video cannot be found. It's teasing us. It knows it's there, but it is apparently forbidden fruit.
I can't answer all your questions, so this entry may not be acceptable, but I found this page while searching for a solution to some of the problems you described and thought that someone else may find this answer (partially) useful.
To save the movie you just took, you need to open and read the data from the promise.
iOS won't save the file anywhere, so MediaPromise.file is always null.
This is my solution to the problem:
private var camera:CameraUI;
private var dataInput:IDataInput;

public function recordVideo():void
{
    // Start the camera and ask for a video
    camera = new CameraUI();
    camera.addEventListener(MediaEvent.COMPLETE, onCameraComplete);
    camera.launch(MediaType.VIDEO);
}

private function onCameraComplete(event:MediaEvent):void
{
    // event.data is a MediaPromise and MediaPromise.open() returns an IDataInput.
    // Let's cast it to a dispatcher and check when it's complete.
    dataInput = event.data.open();
    var dispatcher:IEventDispatcher = IEventDispatcher(dataInput);
    dispatcher.addEventListener(Event.COMPLETE, onDataInputComplete);
}

private function onDataInputComplete(event:Event):void
{
    // We can do whatever we want with the data, so we'll store it in a file
    // in the app's storage directory (a bare new File() has no path to write to;
    // the file name here is arbitrary).
    var file:File = File.applicationStorageDirectory.resolvePath("recordedVideo.mp4");
    var bytes:ByteArray = new ByteArray();
    var stream:FileStream = new FileStream();

    // Reading the data from the opened MediaPromise
    dataInput.readBytes(bytes);
    stream.open(file, FileMode.WRITE);
    stream.writeBytes(bytes, 0, bytes.bytesAvailable);
    stream.close();
}
Also, I'm still looking for a way to put the movie in the CameraRoll

Error while capturing video via RTSP from a network camera using OpenCV and FFMPEG

I am using OpenCV and FFMPEG to capture frames from a network camera using RTSP. The point is that OpenCV successfully loads the FFMPEG .dll but icvCreateFileCapture_FFMPEG_p returns false in the following code of cap_ffmpeg.cpp:
virtual bool open( const char* filename )
{
    close();
    icvInitFFMPEG();
    if( !icvCreateFileCapture_FFMPEG_p )
        return false;
    ffmpegCapture = icvCreateFileCapture_FFMPEG_p( filename );
    return ffmpegCapture != 0;
}
Probably the stream is not ready, or you have problems with the network address or access.
Check whether you have followed the correct procedure for connecting. Try pinging the network resource first to see whether it is reachable. The camera must also allow unauthenticated access, which is set via its web interface. Sometimes MJPEG works while MPEG-4 has problems.
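As a quick sanity check, here is a minimal sketch (assuming OpenCV 3.x or newer) that opens the RTSP URL while explicitly requesting the FFmpeg backend, so a failure points at FFmpeg/RTSP rather than another capture backend; the URL and credentials are placeholders:

#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
    // Placeholder URL: embed credentials only if the camera requires them.
    const std::string url = "rtsp://user:password@192.168.0.10/mystream";

    // Request the FFmpeg backend explicitly.
    cv::VideoCapture cap(url, cv::CAP_FFMPEG);
    if (!cap.isOpened())
    {
        std::cerr << "Could not open RTSP stream: " << url << std::endl;
        return -1;
    }

    cv::Mat frame;
    if (!cap.read(frame))
    {
        std::cerr << "Stream opened but no frame was received" << std::endl;
        return -1;
    }
    std::cout << "Received a " << frame.cols << "x" << frame.rows
              << " frame" << std::endl;
    return 0;
}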
