I want to be able to stream live TV and also include the closed captions. My workflow is to use a Blackmagic Design DeckLink card (the video source is SDI) and use VLC to capture and transcode the video, then send that to a Wowza server to be packetized for HTTP streaming. I have this working for video, but I cannot figure out how to get the closed captions to appear in my player. I am using VLC, QuickTime, Flowplayer on a web page, an iPad, and an iPhone to view the streams, and none of them show the closed captioning that is in the source video. The closed captions are both 608 and 708. I have spent the last 3 days reading everything I can about VLC and closed captions, Wowza and closed captions, x264 and closed captions, VBI, teletext, and even subtitles, but I am nowhere closer than when I started. I sincerely hope that someone here will be able to help me.
I am using a Core i7 with 4 GB RAM, running Ubuntu 10.04 64-bit. I compiled VLC with the following:
./configure '--enable-xvideo' '--enable-sdl' '--enable-avcodec' '--enable-avformat' '--enable-swscale' '--enable-mad' '--enable-a52' '--enable-libmpeg2' '--enable-dvdnav' '--enable-faad' '--enable-vorbis' '--enable-ogg' '--enable-theora' '--enable-mkv' '--enable-flac' '--enable-caca' '--enable-alsa' '--enable-qt4' '--enable-ncurses' '--enable-realrtsp' '--enable-twolame' '--enable-real' '--enable-x264' '--with-decklink-sdk=/home/bimls/bmd/Blackmagic_Decklink_SDK_9.6.4/Linux' '--enable-zvbi'
I use the following to capture and stream the live video:
cvlc decklink:// --decklink-card-index="0" --decklink-mode="ntsc" --rtsp-timeout 0 --sout='#transcode{venc=x264{subme=1,ref=1,bframes=16,b-adapt=1,bpyramid=none,weightp=0},vcodec=h264,vb=1300,acodec=mp4a,ab=96,threads=4}:rtp{dst=127.0.0.1:8888,mux=ts}' --vbi-page=100 --no-vbi-opaque --vbi-position=0
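A quick way to check whether the 608/708 data survives this step, assuming ffmpeg/ffprobe are also installed and you have a short recording of the transcoded output (sample.ts below is a placeholder), would be something like:
# a recent ffprobe prints "Closed Captions" next to the video stream when 608 data is embedded
ffprobe sample.ts
# try pulling the embedded EIA-608 captions out to an .srt via the lavfi movie source
ffmpeg -f lavfi -i "movie=sample.ts[out0+subcc]" sample.srt
If nothing comes out of the transcoded output but the same check succeeds on a raw capture, the captions are being dropped in the transcode step.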
Some of the questions I have are:
Is this the proper way to get the closed captions to appear? If not, what am I doing wrong?
How does one know which vbi-page to look for, since the documented range is "--vbi-page <integer [-2147483648 .. 2147483647]>"?
Does transcoding destroy closed caption data?
So please, any help will be greatly appreciated! Thank you!
Related
Hello, I am trying to capture a stream with OpenCV in Python. I am getting a delay of about 6 seconds when opening the stream, which I feel is very high. In VLC I am able to open the stream in 0.5 seconds at most. Is there any way to speed up opening the stream?
I was able to solve the issue by using OpenCV compiled with GStreamer and by making OpenCV use GStreamer (with cv2.CAP_GSTREAMER) instead of FFmpeg to open the RTSP stream. With this I was able to open the stream in 1.5 seconds.
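For reference, a minimal sketch of the kind of GStreamer pipeline involved, assuming an H.264 RTSP source and gst-launch-1.0 installed (the URL is a placeholder); the same pipeline string, ended with an appsink instead of autovideosink, is what you would hand to cv2.VideoCapture together with cv2.CAP_GSTREAMER:
# latency=0 trims the jitter buffer that usually accounts for most of the startup delay
gst-launch-1.0 rtspsrc location=rtsp://192.168.1.10:554/stream latency=0 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink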
Good day everyone!
So, as the title suggests, I am developing an app with similar functionality to that of Periscope and Facebook Live video streaming. Here is what the end goal is:
A Broadcasting device [user]
EC2 Instance [Hosting an ffmpeg transcoder]
CloudFront Distribution [CDN]
1 to n viewers of the live feed
I've been doing a lot of googling, and what I can't seem to figure out is:
As you send chunks of video to the server from the broadcaster, how do you create an .m3u8 playlist when you don't have all the chunks of video yet (e.g. the device sends its first 5-second chunk of video)?
It seems an .m3u8 file is created from an .mp4 file that is already complete and then broken down into chunks... But I'm sending chunks of the video to the server, so how can it generate the .m3u8 file when more chunks are still coming from the broadcaster, so that the watchers/clients can continuously stitch together the video chunks?
I'll be happy to clarify this question further. Thanks!
If you take a look at the docs for the segment muxer, you can specify the m3u8 to be output, and you can also tell it to update the m3u8 as it goes. It might look something like this:
ffmpeg -i infile.mp4 -c:v copy -c:a copy -map 0 -f ssegment -segment_list playlist.m3u8 -segment_list_type hls -segment_list_size 10 -segment_list_flags +live -segment_time 4 outchunk%07d.ts
Note that segment_list_size is the maximum number of chunks referenced in the m3u8 file at one time, and segment_list_flags +live tells ffmpeg that this is a live stream.
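With -segment_list_flags +live and -segment_list_size 10, the playlist ffmpeg keeps rewriting looks roughly like this (the sequence number and durations are illustrative; there is no #EXT-X-ENDLIST while the stream is live, and the window of referenced chunks keeps sliding forward):
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:42
#EXTINF:4.000000,
outchunk0000042.ts
#EXTINF:4.000000,
outchunk0000043.ts
#EXTINF:4.000000,
outchunk0000044.ts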
I think your confusion is that you are trying to send HLS fragments to the server. Don't. Send a stream via another protocol like RTMP, then let the server convert it to HLS.
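As a sketch of what that server-side conversion could look like, assuming an RTMP ingest server (e.g. nginx-rtmp) is already receiving the broadcast, ffmpeg is installed on the same machine, and the URLs/paths are placeholders:
# pull the incoming RTMP stream and repackage it as a sliding-window HLS playlist
ffmpeg -i rtmp://localhost/live/streamkey -c:v copy -c:a aac -f hls -hls_time 4 -hls_list_size 10 -hls_flags delete_segments /var/www/hls/stream.m3u8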
I use RTMP to stream from my iPhone to my server with FMS. I followed some tutorials and now I have the flv playback file in /webroot/live_recorded.
What I want to do is the following:
1) Stream from iPhone to server using RTMP: DONE
2) Stream back to iPhone using HLS: I don't understand the docs, and I have read hundreds of threads but none helped me. I would like the user to be able to play the stream from the beginning, as it is stored on my server. Thanks.
I'm actually not sure about FMS. I work with Wowza, and I suppose you'll need something like the nDVR feature, or have someone write a special module for you that splits the live stream into small recordings, so that you can play a playlist of such recorded files from your iPhone.
Hopefully someone will recommend a true solution, not just some assumptions :)
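One concrete way to do the splitting described above is to segment the recorded file into an HLS VOD playlist that the iPhone can play from the beginning. A sketch, assuming ffmpeg is installed on the server, the recording is H.264 video (audio is re-encoded to AAC to be safe), and the paths are placeholders:
# split the recording into ~6-second .ts segments plus a VOD playlist the iPhone can play
ffmpeg -i /path/to/live_recorded.flv -c:v copy -c:a aac -f hls -hls_time 6 -hls_playlist_type vod /path/to/hls/live_recorded.m3u8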
How do you serve videos the way YouTube does? Even if a video is long (almost 2 hours) and is viewed in HD, it plays almost instantly, and seeking to not-yet-loaded parts is very fast.
I'm using a dedicated server from Rackspace with 100 Mb up/down for this test, and my ping time to the server is below 50 ms. My local internet connection is 10 Mb, and I can max out my connection when I download something from the server, so the connection to the server is not the issue here.
I'm trying to emulate this, and I've tried real-time streaming using Wowza and pseudostreaming using the H264 Streaming Module. Neither comes close to how fast YouTube delivers video.
The test video is an MP4 (H.264) file, 300 MB, 2 hours long, with the total bitrate set to 500 kbps, and JW Player as the video player.
Wowza streaming (RTMP): loading and then playing the video is fast, but not as fast as YouTube. Seeking is not as fast either; it takes around 5-7 seconds to move to the new position and continue playing the video.
Pseudostreaming with the H264 Streaming Module (HTTP): loading the video takes a long time, since it downloads the video header before playing. A 2-hour video has around 2.5 MB of moov atom (the video header) that it needs to download before it can play. Once it starts playing, seeking to not-yet-downloaded parts is on par with Wowza, but not as fast as YouTube.
What do I need to serve videos with the speed of YouTube? I also need it to buffer/download the video when paused, just like YouTube, so real streaming like Wowza is out.
Pseudostreaming using the H264 Streaming Module would have been nice since it does buffer when paused; it's just that the initial loading time is very long! Is there any way I could remove that initial load time?
What are my other options? I'm open to any other option that I could use on my server.
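Regarding the moov atom issue above: if the moov atom sits at the end of the MP4, relocating it to the front of the file usually removes that initial wait, since the player no longer has to fetch the tail of the file before it can start. A minimal sketch, assuming ffmpeg is available (file names are placeholders); note that it will not shrink the 2.5 MB header itself:
# remux only (no re-encode); +faststart moves the moov atom to the front of the file
ffmpeg -i input.mp4 -c copy -movflags +faststart output_faststart.mp4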
The way YouTube works is different, and they keep changing the way it works. Reverse engineering it by capturing YouTube traffic in Wireshark over the last 4 years has shown me that the pattern is very dynamic. Segmentation is one key; the dual buffer, multiple caching servers and techniques, using the client machine as the buffer/renderer, and the capabilities of the player all matter a lot. There are many, many factors that make YouTube video fast and sleek.
You can emulate this to some extent, but building exactly the same thing takes a lot of effort and infrastructure.
I know there is this &hd=1 code to start a YouTube video in 720p. Is there a code or trick to add at the end of a YouTube video URL to start in 1080p?
Seems to be working again :)
Using an example:
before:
http://www.youtube.com/watch?v=ecsCrOEYl7c
after:
http://www.youtube.com/watch_popup?v=ecsCrOEYl7c&vq=hd1080
note both:
watch_popup
&vq=hd1080
Other possible values can be found here:
https://developers.google.com/youtube/iframe_api_reference#Playback_quality
You can also change the start time of the player by appending this (to 1 minute and 22 seconds in this example):
&t=1m22s
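For example, combining the quality and start-time parameters might look like:
http://www.youtube.com/watch_popup?v=ecsCrOEYl7c&vq=hd1080&t=1m22s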
Some documentation can be found here:
https://developers.google.com/youtube/
It's not possible to set the quality to 1080p with a URL alone. Some years ago it was possible by adding &fmt=37, but that doesn't work anymore.
However, if you can use JavaScript, the YouTube API will allow you to select the quality.
From documentation:
hd (supported players: AS2)
Values: 0 or 1. Default is 0. Setting to 1 enables HD playback by default. This has no effect on the Chromeless Player. This also has no effect if an HD version of the video is not available. If you enable this option, keep in mind that users with a slower connection may have a sub-optimal experience unless they turn off HD. You should ensure your player is large enough to display the video in its native resolution.
The AS2 player will be retired in October 2012, and the embed codes on the YouTube website load the AS3 player by default. To show hd1080 you need to use the JavaScript API. The functions are described here.