As part of a student project, I am currently setting up device-to-device video streaming.
I am using two Raspberry Pi 3 boards with the camera modules and am adding face tracking via OpenCV (all in Python 3).
I want to stream live video captured by Raspberry Pi (X) to Raspberry Pi (Y) and vice versa. The Raspberries will not be in the same building/network.
What I don't want is anyone being able to view the stream on a different device.
As I am new to the whole streaming and security idea, I was wondering if there is some way of adding security to live streams by limiting access to a specific device.
Say, the video of Raspberry Pi (X) CAN ONLY be viewed by Raspberry Pi (Y).
Is this possible? If not, what's the next most secure option (limiting by IP, maybe)?
I am also not fixed on using Raspberries for this project; if there is a different solution, I'd love to hear about it.
Thanks for any ideas.
You're not the first person to do something like this. A Raspberry Pi is an excellent choice for the project, and you should be able to find plenty of guides online for doing something like this.
You'll want to ensure you enable a strong username and password within whatever library you use. To protect the live stream with a username and password, for example, you would enable:
stream_auth_method 2
stream_authentication SOMEUSERNAME:SOMEPASSWORD
https://www.instructables.com/Raspberry-Pi-as-low-cost-HD-surveillance-camera/
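As a rough sketch of the viewing side, and assuming Pi (X) runs the motion software on its default stream port 8081 (the hostname and credentials below are placeholders), Pi (Y) could open the protected stream with OpenCV roughly like this:

```python
import cv2

# Placeholder hostname and credentials - substitute your own.
# 8081 is motion's default stream port; Pi (Y) must be able to reach Pi (X),
# e.g. through a VPN or SSH tunnel between the two networks.
STREAM_URL = "http://SOMEUSERNAME:SOMEPASSWORD@pi-x.example.org:8081/"

cap = cv2.VideoCapture(STREAM_URL)
if not cap.isOpened():
    raise RuntimeError("Could not open stream - check URL, credentials and network path")

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # ... run your face tracking on `frame` here ...
    cv2.imshow("Pi (X) live stream", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

Keep in mind that a username and password alone don't tie the stream to one physical device; pairing it with a VPN or SSH tunnel between the two Pis gets you closer to the "only Pi (Y) can view it" requirement.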
Related
I'm new to IP cameras and I know there are quite a lot of topics about this in the forum already, but I can't find a concrete answer for my needs.
I want to access an IP camera using OpenCV in Python from a Windows PC. As I don't have a camera yet, I need to buy one, and I can't figure out what requirements this camera needs to meet.
For example, there are quite cheap IP cameras (e.g. Xi****) which say they come with an Android or iOS app and are only accessible via those.
I thought you could access any IP cam via OpenCV, but now I'm not sure anymore... can anyone give me an overview of what specs an IP cam needs in order to be accessed via OpenCV on Windows? I don't want to buy a camera and later realize that I can't access the video stream.
I'm really sorry if this has already been asked, but I can't find a satisfying answer to this question, and Google doesn't seem to be very helpful...
Thanks in advance.
Check for an IP cam that can transmit RTSP; OpenCV knows how to work with this type of stream.
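For example, once you have a camera that exposes an RTSP URL (the exact path and credentials vary by vendor, so the one below is only a placeholder), opening it from Python looks roughly like this:

```python
import cv2

# Placeholder RTSP URL - check your camera's documentation for the real path.
cap = cv2.VideoCapture("rtsp://user:password@192.168.1.64:554/stream1")

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("IP camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```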
Full disclosure: I'm a pretty junior developer and new to asking questions. I also don't know that much about video streaming as a concept so if the answer is right in front of my face I probably just glazed right over it.
That being said, I am trying to do something that seems like it should be pretty simple but can't seem to figure it out. I'm trying to get an H.264 live video stream off of a Raspberry Pi and view it in my app. I've found a number of things about encoding videos but couldn't seem to get anything to work.
Anything anyone has to offer would be a large help, even if it is just a direction to look in because I'm pulling my hair out trying to figure this one out.
You'll first need to install some platform on your Raspberry Pi that can serve data to a client. You can look into web server platforms like Apache. Once installed, you can verify this is working by hitting the IP address of the Raspberry Pi from any browser: e.g. 192.168.1.67:80
Then you need to make sure the video is available through the file system on your Raspberry Pi. Searching something like "Adding files to Apache" might help.
You can test that the file is available by hitting the IP address of your Raspberry Pi from any browser: e.g.
192.168.1.67:80/path/to/video.mp4
This means that the video file is available and can be downloaded, but won't be streamed by default. Then you can look into some JavaScript framework that can help you with the streaming portion.
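If you'd rather script that check than open a browser, a quick sketch using the requests library (the address and path are just the examples from above) could look like this:

```python
import requests

# Example URL from above - substitute your Pi's address and file path.
url = "http://192.168.1.67:80/path/to/video.mp4"

resp = requests.head(url, timeout=5)
print(resp.status_code)                    # 200 means the server is serving the file
print(resp.headers.get("Content-Type"))    # e.g. "video/mp4"
print(resp.headers.get("Content-Length"))  # file size in bytes
```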
Apple has its well-known HLS protocol for streaming video. You would need to first encode the video input coming from the camera, then pass it to your server, which does all the "behind the scenes" work and provides you with an *.m3u8 URL. I've implemented this pattern with Wowza Streaming Engine. You can use it or similar tools.
On the flip side, if you're inclined towards a simpler and more straightforward solution, more like a CDN approach, then you may follow @Bret's answer.
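If you just want to sanity-check the *.m3u8 URL your streaming server hands back before wiring it into the app, OpenCV can usually open HLS playlists too, assuming it was built with the FFmpeg backend (the URL below is a placeholder):

```python
import cv2

# Placeholder playlist URL - use whatever your streaming server (e.g. Wowza) gives you.
cap = cv2.VideoCapture("https://example.com/live/stream/playlist.m3u8")

ok, frame = cap.read()
print("opened:", cap.isOpened(), "got a frame:", ok)
cap.release()
```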
I am hoping someone can tell me if a 2-wire camera exists. I have had no luck finding one that is relatively cheap. The 2 wires have to carry both the signal and the power for the camera.
Basically I want to use a device like a Raspberry Pi or Beaglebone to process images.
HOWEVER, I want to utilise existing 2-wire (shielded twisted pair) cables in the field and have the image processing device in the central location.
Anyone know of such a thing? Thanks.
Any analogue camera can, assuming that you don't require PTZ telemetry to the camera. You could also use something like this:
Balun
Is that possible or do you need to connect the kinect to a computer and stream the images in (almost) real time to an iPhone? Is it even possible to get ~30fps via stream on the iphone?
The Kinect uses a USB connection, and even if you could make up some sort of cable to connect a Kinect to the Lightning or 30-pin connector, iOS would not recognise the Kinect as it does not have a driver, so the short answer is no, you cannot connect a Kinect directly to the iPhone.
For a simple solution/alternative, you might want to check out Occipital/Structure.io, who are selling a depth sensor for (some) iDevices for ca. 380 USD.
Apparently they are using Primesense Carmine sensors ("which is essentially an equivalent of ASUS Xtion Live under different brand name" according to [iPiSoft's sensor comparison](http://wiki.ipisoft.com/Depth_Sensors_Comparison)).
You can review the differences to the Kinect at the previous link, but basically it boils down to the Kinect being bigger and heavier, having a motorized tilt and requiring external power.
To get back to your question:
if you look around you'll find working examples of how to get OpenNI running on BeagleBone dev-boards under Linux, and thus it is more than conceivable that you'll be able to compile and run it for/on iOS as well (possibly requiring a jailbreak).
You could also have a look at libfreenect, another open implementation of drivers for the Primesense family of sensors (as well as the Kinect 2).
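For a feel of what working against those drivers looks like on a desktop or dev-board (before any iOS porting effort), here is a minimal sketch using libfreenect's Python wrapper, assuming it was built with the Python bindings and a Kinect v1 is plugged in:

```python
import cv2
import numpy as np
import freenect  # Python bindings shipped with libfreenect (assumed to be installed)

# Grab one RGB frame and one depth frame synchronously from the first Kinect.
rgb, _ = freenect.sync_get_video()
depth, _ = freenect.sync_get_depth()

# libfreenect returns RGB; OpenCV expects BGR for display.
cv2.imshow("rgb", cv2.cvtColor(rgb, cv2.COLOR_RGB2BGR))

# Depth comes back as 11-bit values in a uint16 array; scale it for display.
cv2.imshow("depth", (depth / 2048.0 * 255).astype(np.uint8))
cv2.waitKey(0)
```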
I'm writing LabVIEW software that grabs images from an IMAQ compatible GigE camera.
The problem: this is a collaborative project, so I only have intermittent access to the actual camera. I'd like to be able to keep developing this software even when the camera isn't present.
Is there a simple/fast way to create a virtual or dummy IMAQ camera in software? Ideally I'd like the dummy camera to grab frames from an AVI or a stack of JPEGs. Something like this must exist; I just can't find it on Google.
I'm looking for something that won't take very long (e.g. < 2 hours' effort) and that is abstracted away through the standard LabVIEW IMAQ interface, so that my software won't know or care whether it's dealing with a dummy camera or an actual camera.
You can try this method using LabVIEW classes:
Hardware Emulation Using LabVIEW Classes
If you have the IMAQdx driver, you might consider just buying a cheap USB webcam for $10.
Use the IMAQdx driver (assuming you have it), and then insert the Vision Acquisition Express VI, and you can choose AVIs or even pics as a source.
Something like this: GigESim is camera emulation software. Unfortunately it is proprietary and too expensive (>$500) for my own needs, but perhaps others will find the link useful.
Anyone know of a viable Open Source alternative?
There's an IP Camera emulator project that emulates an IP camera with Python. I haven't used it myself, so I don't know if it can be used by IMAQ.
Let us know if it's good for you.
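For anyone curious what such an emulator involves, here is a minimal illustrative sketch (not the linked project's code): it loops over a folder of JPEG files and serves them as a multipart MJPEG stream, which is how many simple IP cameras expose their video. Whether IMAQ can consume it directly is still the open question above.

```python
import glob
import itertools
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

# Assumption: a folder of pre-extracted JPEG frames to loop over.
FRAMES = sorted(glob.glob("frames/*.jpg"))

class MJPEGHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Announce a multipart MJPEG stream with boundary "frame".
        self.send_response(200)
        self.send_header("Content-Type", "multipart/x-mixed-replace; boundary=frame")
        self.end_headers()
        # Cycle over the JPEGs forever, sending each as one part of the stream.
        for path in itertools.cycle(FRAMES):
            with open(path, "rb") as f:
                jpg = f.read()
            self.wfile.write(b"--frame\r\n")
            self.wfile.write(b"Content-Type: image/jpeg\r\n")
            self.wfile.write(f"Content-Length: {len(jpg)}\r\n\r\n".encode())
            self.wfile.write(jpg + b"\r\n")
            time.sleep(1 / 25)  # ~25 fps

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), MJPEGHandler).serve_forever()
```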
I know this question is really old, but hopefully this answer helps someone out.
IMAQdx also works with Windows DirectShow devices. While normally these are actual physical capture devices (think USB Webcams), there is no necessity that they have to be.
There are a few different pre-made options available on the web. I found using OBS (Open Broadcaster Software) Studio and this Virtual Cam plugin to be easy enough. Basically:
Download and install both.
Load your media sources in the sources list.
Enable the VirtualCam stream (Tools > VirtualCam). Press Start.