Streaming video frames to web page - opencv

I am working on a project for streaming real-time video to a web page. I read the video frames one by one with OpenCV on a Raspberry Pi, then send the compressed frames via TCP socket to a web server (I am using Flask), which hosts the web page.
So far I have tried streaming the frames to the web page with an HTTP multipart response. It looks good and is used in several projects posted online, but I am not sure that HTTP is the best choice.
My other idea is to transmit the frames to the web page via WebSocket, since HTTP is usually slower and has more overhead, but I have not found any similar use case for WebSockets.
So I am curious whether WebSockets could be a better choice than the HTTP multipart response.
Has anybody tried using WebSockets for video streaming? Or does anybody have a good argument for whether WebSockets would bring better performance?
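For reference, here is a minimal sketch of the multipart approach I am using, assuming for simplicity that the Flask server reads the camera directly (in my actual setup the frames arrive over the TCP socket instead):

    # Minimal MJPEG-over-HTTP multipart sketch with Flask and OpenCV.
    # The local camera capture is for illustration only; in my setup the
    # JPEG frames would come from the TCP socket instead.
    import cv2
    from flask import Flask, Response

    app = Flask(__name__)

    def generate_frames():
        cap = cv2.VideoCapture(0)  # placeholder: local camera
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            ok, jpeg = cv2.imencode('.jpg', frame)
            if not ok:
                continue
            # Each part of the multipart response carries one JPEG frame.
            yield (b'--frame\r\n'
                   b'Content-Type: image/jpeg\r\n\r\n' + jpeg.tobytes() + b'\r\n')

    @app.route('/video')
    def video():
        return Response(generate_frames(),
                        mimetype='multipart/x-mixed-replace; boundary=frame')

    if __name__ == '__main__':
        app.run(host='0.0.0.0', port=5000)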

Related

Checking for WebRTC connectivity - reliable methods

I have a live video chat application and I use a TURN server which supports STUN/TURN and both UDP and TCP transport.
Sometimes users are connected to a network which blocks so many ports and protocols that a WebRTC connection simply cannot be established (usually these are corporate networks). I would like to check whether a WebRTC connection is possible before users try to connect to each other (in other words, perform a technical check).
How can I do it? Ideas I have in my head:
Try to download a hosted chunk of data (audio file, for example) via WebRTC - is it possible and would this be enough to make sure both inbound and outbound connections are open?
Use a TURN server as a host to connect to and see if the connection fails (I have no idea whether this is possible)
Use Flash to try to download/upload a chunk of data over specific ports and protocols, maybe even using Cirrus. However, I am not sure this test would be accurate from a WebRTC perspective.
Any other ideas?
Additional requirement: the checking technique must support Chrome, Opera and Firefox. Preferably also IE/Safari via the Temasys plugin.
Edit 1: gathering ICE candidates is a good idea; however, it is not 100% reliable. Once I checked the logs in my application and it had actually gathered relay ICE candidates, but video/audio transmission still failed. I tested on AppRTC as well and got the same results.
The best way to check is to connect with just a data channel first. Your users won't notice. If that works then audio and video are almost guaranteed to work. As a bonus, you can use the data channel for signaling for super-fast connecting when your users are ready.
The typical WebRTC approach to this is to create a peer connection with STUN and TURN servers, call createOffer and setLocalDescription, and watch the candidates that are gathered. See e.g. http://webrtc.github.io/samples/src/content/peerconnection/trickle-ice/
If you get srflx candidates, your STUN server works (i.e. UDP is not blocked). More interesting is whether you get relay candidates. If you do, using TURN as a fallback will work, though quality might suffer if TURN/TCP is used. If you don't get relay candidates, calls are very unlikely to work.
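If you want to run the same probe outside the browser, here is a rough sketch using the Python aiortc library (the STUN/TURN URLs and credentials are placeholders; note that aiortc finishes ICE gathering inside setLocalDescription, so you can simply inspect the local SDP afterwards):

    # Sketch of an ICE connectivity probe with aiortc.
    # The server URLs and credentials below are placeholders.
    import asyncio
    from aiortc import RTCConfiguration, RTCIceServer, RTCPeerConnection

    async def probe():
        config = RTCConfiguration(iceServers=[
            RTCIceServer(urls='stun:stun.example.com:3478'),
            RTCIceServer(urls='turn:turn.example.com:3478',
                         username='user', credential='secret'),
        ])
        pc = RTCPeerConnection(configuration=config)
        pc.createDataChannel('probe')  # at least one m-line is needed to trigger gathering
        offer = await pc.createOffer()
        await pc.setLocalDescription(offer)  # gathers all candidates before returning
        sdp = pc.localDescription.sdp
        print('srflx candidate (STUN reachable):', ' typ srflx' in sdp)
        print('relay candidate (TURN reachable):', ' typ relay' in sdp)
        await pc.close()

    asyncio.run(probe())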

NSURLConnection vs NSStream for rapid server communication

Let's say we have an app that displays some kind of dashboard. This dashboard, however, needs to be updated extremely often (say every 500 ms). I'm familiar with long-polling requests and know how I could implement them with NSURLConnection on a background thread. However, it seems this will lead to two big problems: request/response concurrency and the overhead of long-polling at such short intervals. Although the first problem can be solved with some techniques, I think such frequent requests to a server are a general problem.
So after some research I found the NSStream class and its descendants NSInputStream & NSOutputStream. My idea is to make a connection to the server and keep it alive the whole time, and then, at 500 ms intervals, send a GET request on the output stream and read the data from the input stream.
So here are my questions:
Am I on the right track for implementing this?
Does the server need to be prepared in some special way to deal with this kind of connection (I mean, won't it drop the connection after some timeout)?
Is there a real benefit to skipping connection establishment, in terms of app performance and a lower refresh time for the dashboard?
UPDATE
I've implemented the classic way. When the request method is hit and the previous request has not yet finished, I cancel it, so there is only one active connection at a time to prevent concurrency issues. Also, if I don't receive a response within 500 ms I don't need it at all, as it will be outdated anyway. I'm getting pretty neat results on both Wi-Fi and 3G. As I expected, on EDGE a response is dropped every 3 to 4 requests.
Still wondering about the streams, however. I did try to follow the Apple reference, but when I send an HTTP GET via the output stream, my input stream returns 403 Forbidden from the server. This could be entirely a server problem, but I'm not sure whether this is the right track and whether it's worth changing the server side.
Q1) Am I on the right track for implementing this?
A) I'd suggest WebSockets.
Q2) Does the server need to be prepared in some special way for this kind of connection (i.e. won't it drop the connection after some timeout)?
A) Even though you could configure persistent (keep-alive) connections on the web server to handle this easily, I'd still suggest WebSockets.
Q3) Is there a real benefit to skipping connection establishment, to improve app performance and lower the dashboard refresh time?
A) Yes. Opening and closing connections is costly, which is why keep-alive connections exist and why Google introduced SPDY for web apps. Sockets solve this problem for you, and WebSockets are a good way to go. Frequent polling is not, because you would be contacting the server every 0.5 seconds.
WebSocket provides full-duplex communication. Additionally, WebSocket enables streams of messages on top of TCP; TCP alone deals with streams of bytes and has no inherent concept of a message.
The WebSocket protocol was standardized by the IETF as RFC 6455 in 2011, and the WebSocket API in Web IDL is being standardized by the W3C.
WebSocket is designed to be implemented in web browsers and web servers, but it can be used by any client or server application. The WebSocket Protocol is an independent TCP-based protocol. Its only relationship to HTTP is that its handshake is interpreted by HTTP servers as an Upgrade request. The WebSocket protocol makes more interaction between a browser and a website possible, facilitating live content and the creation of real-time games. This is made possible by providing a standardized way for the server to send content to the browser without being solicited by the client, and allowing for messages to be passed back and forth while keeping the connection open. In this way a two-way (bi-directional) ongoing conversation can take place between a browser and the server.
You can find more about WebSockets here
Here are some good WebSocket client libraries in Objective-C:
SocketRocket and
UnittWebSocketClient
Note:
These libraries use NSStream
Hope this helps
As long as your server is an HTTP server, it will disconnect you after returning the result.
So if you want to keep the connection alive long enough, you must implement your own protocol based on NSStream/sockets on both the iOS and server sides.
You may choose a well-known socket-based protocol like WebSocket; Square's SocketRocket is a popular iOS library for it, built on NSStream.
If your dashboard needs real-time updates, I think it's worth deploying an NSStream/socket-based protocol.
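To illustrate the server side of such a push-based dashboard feed, here is a minimal sketch using the third-party Python websockets library; the payload is just a placeholder timestamp, and a real dashboard would send its actual data:

    # Sketch: push a dashboard update to each connected client every 500 ms.
    # Requires the third-party 'websockets' package (pip install websockets).
    import asyncio
    import json
    import time

    import websockets

    async def push_updates(websocket):  # single-argument handler (recent websockets versions)
        while True:
            payload = json.dumps({'ts': time.time()})  # placeholder dashboard data
            await websocket.send(payload)
            await asyncio.sleep(0.5)  # the 500 ms refresh interval

    async def main():
        async with websockets.serve(push_updates, 'localhost', 8765):
            await asyncio.Future()  # run forever

    asyncio.run(main())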

Streaming live desktop video to a web application

I'm looking to find a way to stream a user's desktop LIVE (through some piece of software, such as Open Broadcaster Software) to a web application.
I'm assuming I should use a CDN to get the live-streamed video to my web application, but how (and with what software) do I get the user's desktop to a streaming service? Should I use a service such as Red5 or an AWS service? Or, if only a few viewers are using it, should I host the service myself?
Although I have built my share of web applications, I have never dealt with live media streaming before, and I would appreciate any assistance anyone could lend.
By far the best resource for video on Rails is OpenTok
Our own demo here: http://bvc-video.herokuapp.com/broadcasts/1
--
Streaming
Video streaming is a tough one
The problem really depends on what you're trying to stream. If it's "live" video, i.e. captured and sent directly to the viewers, you'll have to use some sort of server to process the video.
Although I don't have huge experience with this, the main issue we've found is the compression / distribution of the feed. It's actually very simple to achieve video streaming on iOS - all the software / hardware is the same (just use the same API / drivers).
This often negates the requirement for a central server, although one is highly recommended (almost required) in many cases. Problems arise when you try to beam to multiple clients on multiple systems, as you'll run into compatibility issues.
--
Solutions
The solutions we've found are thus:
The most stable part of the app is to take the stream & send it to a server (see the sketch below)
The wizardry is then to beam that stream to multiple clients
The way to do this is typically to use a Flash widget & pull the stream from the server
WebRTC is becoming the standard (OpenTok is built on this)
I'm not sure about video compression / distribution. Akamai is an industry heavyweight, but I've never used it. Brightcove too
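As a sketch of the "take the stream & send it to a server" step, this pushes a macOS desktop capture to an RTMP ingest point with ffmpeg, launched from Python; the avfoundation device index and the ingest URL are placeholders (list your devices with ffmpeg -f avfoundation -list_devices true -i ""):

    # Sketch: capture the desktop with ffmpeg and push it to an RTMP server.
    # The avfoundation device index ('1') and the RTMP URL are placeholders.
    import subprocess

    subprocess.run([
        'ffmpeg',
        '-f', 'avfoundation', '-framerate', '30', '-i', '1',  # macOS screen capture
        '-c:v', 'libx264', '-preset', 'veryfast',
        '-pix_fmt', 'yuv420p',  # broad player compatibility
        '-f', 'flv', 'rtmp://ingest.example.com/live/streamkey',
    ])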

How to convert RTSP streaming to Http Live Streaming using lighttpd?

I'm having a problem here. I want to play an RTSP stream on iPad and iPhone, but I have found that it would be much easier if I used HTTP Live Streaming. I want to convert my RTSP stream to HTTP Live Streaming using lighttpd, but I really have no idea how to do that. Does lighttpd accept an RTSP stream URL as input? Can anyone help? Thanks!
You have two choices:
1) Run a server on your network that re-streams RTSP as HLS.
a) Wowza - popular, expensive
b) Live555 - free, lots of work
c) ffserver - free and as basic as it gets; tons of work to make it work
Advantages:
No bandwidth restrictions over cellular or Wi-Fi
Plays with the native Apple players
Disadvantages:
High server bandwidth - if you're paying for server time you may want to watch this
High latency - forget any kind of live video
2) Run an FFmpeg-based player on the device.
Advantages:
a) A lot easier than it used to be; we do this all the time
b) You have to deal with the LGPL license, but there are clear guidelines at ffmpeg.org and it's not a huge hassle
c) All on-device, no server load issues
Disadvantages:
Limited bandwidth over cellular (about 10 min intervals), unlimited over Wi-Fi
lighttpd doesn't accept RTSP as an input. You will need some sort of translator program to read the RTSP stream and output the files to the website storage. I think you could do it with the avconv/ffmpeg program.
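A sketch of that translator step, launching ffmpeg from Python to write HLS segments into the web server's document root (the RTSP URL and output path are placeholders; -c:v copy assumes the camera already produces H.264, otherwise re-encode with libx264):

    # Sketch: re-package an RTSP stream as HLS segments that lighttpd can
    # serve as static files. The URL and output path are placeholders.
    import subprocess

    subprocess.run([
        'ffmpeg',
        '-rtsp_transport', 'tcp',        # often more robust than UDP
        '-i', 'rtsp://camera.example.com/stream',
        '-c:v', 'copy', '-c:a', 'aac',   # copy H.264 video, transcode audio
        '-f', 'hls',
        '-hls_time', '4',                # ~4 s per segment
        '-hls_list_size', '5',
        '-hls_flags', 'delete_segments', # keep a rolling window of segments
        '/var/www/html/stream/playlist.m3u8',
    ])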

Fake video streaming

I am building an iOS app which displays video streams from a somewhat complex backend. While developing, I want to have some sort of test video stream I can use. Ideally this would also work without an internet connection.
The video stream could show, for example, the current time, or just a simple animation. What would be a good way of doing this on a Mac without having to install a whole suite of tools?
On your Mac you can set up a web server or streaming server to provide a constant video stream for testing purposes. You won't need internet access. You will, of course, need to ensure that the OS X firewall is either disabled or allows requests on the relevant ports (80, most likely).
Two simple approaches I can see:
Wowza MPEG-TS stream of the webcam on your Mac
Install Wowza Media Server; a developer license is free
Configure a basic application with MPEG-TS streaming
Use an encoding application, like Flash Media Live Encoder (free), Wirecast (demo version free), or some other software, and start streaming from your webcam to the WMS
Alternatively, with a bit more effort, you could set up Wowza to stream a file in a loop
Be sure to get the codec settings correct
M3U8 + MPEG-TS static files over plain HTTP
Simply set up a basic web server (lighttpd, Apache httpd, Apache Tomcat, whatever) to serve static files
Whip up an M3U8 file that first points to a .ts media file, and then secondly back to itself
Have a look at the MPEG-TS/M3U8 live streaming material to work out the details. You'll need a properly segmented video file to start with. A sketch of this approach is given below.
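For the static-file approach, here is a sketch that generates a synthetic HLS test stream with ffmpeg's built-in testsrc filter (a test pattern with a running counter), so no camera or internet connection is needed; paths are placeholders, and the output directory must exist before running:

    # Sketch: generate an HLS test pattern with ffmpeg's testsrc filter
    # and serve the segments as plain static files. Paths are placeholders.
    import subprocess

    subprocess.run([
        'ffmpeg',
        '-re',                            # pace the output in real time
        '-f', 'lavfi', '-i', 'testsrc=size=640x360:rate=25',
        '-c:v', 'libx264', '-pix_fmt', 'yuv420p',
        '-f', 'hls',
        '-hls_time', '4',
        '-hls_list_size', '5',
        '-hls_flags', 'delete_segments',
        'stream/playlist.m3u8',           # the stream/ directory must already exist
    ])
    # Then, in another terminal:  python -m http.server 8080
    # and point the player at http://localhost:8080/stream/playlist.m3u8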
