How does TvTak work for real-time content recognition of TV broadcast channels? - image-processing

TvTak is a platform for TV content recognition. It can automatically recognize live broadcast TV as well as pre-recorded advertising video.
TvTak's core technology appears to be real-time image matching between a front-end image (from the TV viewer's phone camera)
and back-end frame images (captured in real time from the broadcast).
My questions are:
1. How can TvTak get the real-time broadcast channel streams? We know the channels are encrypted by the cable operators! Does TvTak need to cooperate with the cable operators, or do they get the video from some free internet broadcast stream?
2. What might TvTak's matching algorithm be?
3. How does TvTak get the electronic program guide (EPG) for all channels?

http://www.tvtak.com/developers.html says they take the real-time streams and index them on the fly:
Live TV – In the back office, TvTak indexes real-time broadcast TV
channels in multiple countries. Video is not recorded but only
analyzed in real time to produce the reference matching identifiers.
Cues for Pre-recorded Clips – for ad spots, movie trailers, or any other pre-recorded content, reference IDs can be generated in advance.
I doubt anyone will tell you the exact algorithm.
Why do they need the EPG? They have the live streams, which include things like "programme name" as metadata (I assume!)
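
TvTak has never published its matcher, but camera-to-screen recognition is commonly built on compact per-frame fingerprints (perceptual hashes or local-feature descriptors) compared by Hamming distance, which tolerate the blur, glare, and perspective a phone camera introduces. Below is a minimal sketch of that general idea using a difference-hash fingerprint; every name in it is hypothetical and this is not TvTak's actual algorithm:

```typescript
// Hypothetical sketch of fingerprint-based frame matching. A frame is
// reduced to a tiny grayscale thumbnail; a 64-bit "difference hash"
// records whether each pixel is brighter than its right-hand neighbour.
// Similar frames produce hashes that differ in only a few bits.

type Gray = number[][]; // 8 rows x 9 columns grayscale thumbnail, values 0..255

function dHash(thumb: Gray): bigint {
  let hash = 0n;
  for (let y = 0; y < 8; y++) {
    for (let x = 0; x < 8; x++) {
      hash = (hash << 1n) | (thumb[y][x] > thumb[y][x + 1] ? 1n : 0n);
    }
  }
  return hash;
}

// Number of differing bits between two 64-bit fingerprints.
function hammingDistance(a: bigint, b: bigint): number {
  let x = a ^ b;
  let bits = 0;
  while (x > 0n) {
    bits += Number(x & 1n);
    x >>= 1n;
  }
  return bits;
}

// Back-office index: fingerprints of recently broadcast frames per channel.
interface IndexedFrame { channel: string; timestampMs: number; hash: bigint; }

function matchCameraFrame(
  cameraThumb: Gray,
  index: IndexedFrame[],
  maxBits = 10 // tolerance for camera noise, glare, perspective
): IndexedFrame | null {
  const query = dHash(cameraThumb);
  let best: IndexedFrame | null = null;
  let bestDist = maxBits + 1;
  for (const frame of index) {
    const d = hammingDistance(query, frame.hash);
    if (d < bestDist) { bestDist = d; best = frame; }
  }
  return best;
}
```

A production system would index the hashes (e.g. in a multi-index hash table) instead of scanning linearly, and would confirm candidates with a stronger geometric check. Note that the "not recorded, only analyzed" claim on the developer page fits this kind of pipeline: only the fingerprints need to be kept, not the video.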

Related

Adaptive Bitrate Streaming (ABR)

How does an "Adaptive Bit Rate Streaming" works? For instance, how does Netflix or Youtube manages to continue playing the video from that very timestamp with a different resolution? How do they get to know about the bandwidth or network speed of any client? What if a particular video resolution was not available at the client's nearest OCA or CDN?

WebRTC videochat p2p: Switch from local stream to p2p stream

I want to establish a p2p video chat using WebRTC.
This is meant for a "doctor-patient" 1-on-1 video chat.
The video conference should start at a certain date/time.
However, both parties should be able to already join the room, but not see each other. They should be able to adjust their camera in private.
How could this be achieved?
I'm absolutely not sure which way to go here.
Could I perhaps switch from the local stream to the p2p stream at the appointment's start date/time?
Thank you.
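
One way to approach this (a sketch, not the only design): call getUserMedia immediately so each party gets a private local preview, but only add the tracks to the RTCPeerConnection and start the offer/answer exchange once the appointment time arrives. The signaling WebSocket below is a hypothetical server you would provide yourself; answer and ICE handling are omitted for brevity:

```typescript
// Browser-side sketch: private preview first, p2p only at the start time.
const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
});
const signaling = new WebSocket("wss://example.com/room/123"); // hypothetical

async function joinRoomEarly(): Promise<MediaStream> {
  // Local preview only: nothing is sent to the peer yet.
  const local = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  (document.getElementById("preview") as HTMLVideoElement).srcObject = local;
  return local;
}

function startCallAt(startTime: Date, local: MediaStream): void {
  const delayMs = Math.max(startTime.getTime() - Date.now(), 0);
  setTimeout(async () => {
    // Only now do the tracks go onto the connection, so neither party
    // can see the other before the scheduled time.
    local.getTracks().forEach(track => pc.addTrack(track, local));
    const offer = await pc.createOffer();
    await pc.setLocalDescription(offer);
    signaling.send(JSON.stringify(offer));
  }, delayMs);
}

pc.ontrack = ev => {
  (document.getElementById("remote") as HTMLVideoElement).srcObject = ev.streams[0];
};
```

The client-side timer alone should not be trusted: the signaling server must also refuse to relay offers before the scheduled time, since a participant could change their local clock.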

Broadcast live audio to mobile devices with the lowest possible latency

I need to send the audio of a translator (a person) in real time (or as close as possible) to mobile phones so attendees can "tune in".
I looked at an Icecast server (too laggy) and WebRTC (iOS 5 needs to be supported), and am now looking into node.js with binary streaming. I also considered an FM transmitter, but iPhones don't have an FM radio.
I can capture the audio data, send it to a server, and transcode it in real time (MLT Framework). I'm just wondering what the best course of action would be; I would really like to use the browser as the delivery platform.
Thanks in advance
Chris
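
Since the question already leans toward node.js with binary streaming, here is a minimal sketch of that route using the ws WebSocket package (npm install ws): the translator's client pushes small encoded audio chunks, and the server fans each chunk out to every connected listener. The /broadcast path and the chunking scheme are assumptions for illustration:

```typescript
// Low-latency audio fan-out over WebSockets (sketch).
import { WebSocketServer, WebSocket } from "ws";

const wss = new WebSocketServer({ port: 8080 });
const listeners = new Set<WebSocket>();

wss.on("connection", (ws, req) => {
  if (req.url === "/broadcast") {
    // The single translator connection: relay every binary chunk as-is.
    ws.on("message", chunk => {
      for (const listener of listeners) {
        if (listener.readyState === WebSocket.OPEN) listener.send(chunk);
      }
    });
  } else {
    // Everyone else is an attendee "tuning in".
    listeners.add(ws);
    ws.on("close", () => listeners.delete(ws));
  }
});
```

Latency then comes mostly from capture and chunk duration (keep chunks in the tens of milliseconds), not from server-side buffering as with Icecast. Whether the listeners' browsers can decode and play the chunks (e.g. via the Web Audio API) still depends on the oldest devices you have to support.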

Library or protocol to allow streaming music across LAN?

Are there any libraries or open protocols for sending music bitstreams to multiple machines on a LAN?
If you want everyone to be listening to the same content, like a radio station, try streaming from a media server. Then create a page with an embedded player for a channel on your media server, and publish the page.
Be prepared for high data usage, as each connected user increases data consumption by 1× the stream bitrate: for example, a 128 kbps stream with 20 listeners consumes roughly 2.5 Mbps of upstream bandwidth.
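
For the embedded-player page, something as small as the following is enough, assuming an Icecast/Shoutcast-style MP3 mount at a hypothetical LAN address:

```html
<!DOCTYPE html>
<html>
  <body>
    <h1>LAN Radio</h1>
    <!-- mediaserver.local:8000/stream.mp3 is a placeholder for your
         media server's actual stream mount point -->
    <audio controls autoplay src="http://mediaserver.local:8000/stream.mp3">
      Your browser does not support the audio element.
    </audio>
  </body>
</html>
```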

How to use HTTP Live Streaming protocol in iPhone SDK 3.0

I have developed an iPhone application and submitted it to the App Store, but it got rejected based on the criteria below.
Thank you for submitting your yyyyyyyy application. We have reviewed your application and have determined that it cannot be posted to the App Store at this time because it is not using the HTTP Live Streaming protocol to broadcast streaming video. HTTP Live Streaming is required when streaming video feeds over the cellular network, in order to have an optimal user experience and utilize cellular best practices. This protocol automatically determines bandwidth available to users and adjusts the bandwidth appropriately, even as bandwidth streams change. This allows you the flexibility to have as many streams as you like, as long as 64 kbps is set as the baseline feed.
In my app I have to stream pre-recorded .m4v and .mp3 files from my server. I used MPMoviePlayerController to stream and play those videos/audio.
How do I implement the HTTP Live Streaming protocol in my app? Can I also get some sample code?
Thanks in advance!
There are many documents about Apple's HTTP Live Streaming:
HTTP Live Streaming Overview
IETF HTTP Live Streaming Internet-Draft
Many encoder devices claim to support this protocol, e.g. Inlet's Spinnaker, acquired by Cisco and renamed the Cisco Media Processor family.
For a software solution, have a look at Wowza.
Also check this note from the Apple documentation:
Important: iPhone and iPad apps that send large amounts of audio or video data over cellular networks are required to use HTTP Live Streaming.
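
Concretely, what the rejection asks for is that the app play an HLS master playlist (an .m3u8 file) listing several variant streams, with a 64 kbps stream (typically audio-only) as the baseline. A minimal sketch of such a master playlist, with hypothetical paths:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=64000,CODECS="mp4a.40.2"
audio_only/prog_index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=400000
low/prog_index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1200000
high/prog_index.m3u8
```

MPMoviePlayerController can play this directly: point its contentURL at the master .m3u8 instead of the .m4v file. Apple's HTTP Live Streaming tools (mediafilesegmenter for pre-recorded files, mediastreamsegmenter for live input) produce the media segments and per-variant playlists.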
