Is it possible to create a live event by simply using a video file instead of a webcam? I don't see an option like this in live event creation.
Doing this directly on YouTube: no.
Doing it by encoding a video file and pushing it to YouTube in real time: yes.
How do you do that?
Try Wirecast Play. It's like a live-feed console, but free with some limitations. Other RTMP-capable tools may also work; one of them is FFmpeg. I've tried it before and can confirm it works, but it's a backend with only a command line. For more functionality you need a front-end app (you can stream/pipe to FFmpeg).
About FFmpeg's RTMP support, read this:
https://www.ffmpeg.org/ffmpeg-protocols.html#rtmp
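For example, a minimal FFmpeg sketch for pushing a local file to YouTube's RTMP ingest looks like this. The ingest URL and YOUR-STREAM-KEY are placeholders taken from your live event's encoder settings, and the encoding options are just reasonable assumptions:

    # Read the file at its native frame rate (-re), loop it forever,
    # encode to H.264/AAC, and push it to YouTube over RTMP.
    ffmpeg -re -stream_loop -1 -i input.mp4 \
        -c:v libx264 -preset veryfast -b:v 2500k -g 60 \
        -c:a aac -b:a 128k -ar 44100 \
        -f flv rtmp://a.rtmp.youtube.com/live2/YOUR-STREAM-KEY

If the file is already H.264/AAC, you can replace the encoding options with -c copy to avoid recompressing.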
Related
I am streaming some FTA channels from
http://www.tbsdtv.com/products/tbs6985-dvb-s2-quad-tuner-pcie-card.html
using MediaPortal:
http://www.team-mediaportal.com/
I then get an RTSP URL from MediaPortal for the channel I timeshift,
and with VLC I can send that stream to a media server (FMS) to get HLS, HDS, RTMP, and RTSP.
I have 3 servers running Erlyvideo (Flussonic),
so delivery is taken care of.
I want some alternative solutions besides that.
I have tried several methods to work this out,
including:
VLC
IPTVL
DVB Dream
But the quality is better when I stream something as a file; only FMLE works well with a live stream, and for that we can only use DirectShow-enabled devices like
http://www.viewcast.com/products/osprey-cards
I am doing this on Windows.
If someone has more methods or wants to share their setup, please do so.
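For what it's worth, the VLC-to-FMS step described above can also be done with FFmpeg; a rough sketch, where the RTSP URL, server address, and application/stream names are placeholders for your setup:

    # Pull the timeshift RTSP stream from MediaPortal and push it to
    # FMS over RTMP. -c copy only remuxes; if the source codecs are not
    # FLV-compatible (H.264/AAC), replace it with explicit encoders.
    ffmpeg -rtsp_transport tcp -i rtsp://mediaportal-host:554/stream \
        -c copy -f flv rtmp://fms-host/live/mychannel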
I use RTMP to stream from my iPhone to my server running FMS. I followed some tutorials, and now I have the FLV playback file in /webroot/live_recorded.
What I want to do is the following:
1) Stream from iPhone to server using RTMP: done.
2) Stream back to the iPhone using HLS: I don't understand the docs, and I've read hundreds of threads, but none helped. I would like the user to be able to play the stream from the beginning, as it is stored on my server. Thanks.
I'm actually not sure about FMS; I work with Wowza. I suppose you'll need something like its nDVR feature, or to have someone write a special module for you that splits the live stream into small recordings, so that you can play a playlist of those recorded files from your iPhone.
Hopefully someone will recommend a true solution, not just some assumptions :)
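That said, if the goal is simply to let users play the recording from the beginning, one approach outside of FMS is to repackage the recorded FLV into a VOD-style HLS playlist with FFmpeg; a minimal sketch, where the file name and output location are assumptions:

    # Repackage the recorded FLV as HLS. -c copy avoids re-encoding if
    # the recording is already H.264/AAC; -hls_list_size 0 keeps every
    # segment in the playlist so playback starts from the beginning.
    ffmpeg -i /webroot/live_recorded/myStream.flv \
        -c copy -hls_time 10 -hls_list_size 0 \
        -f hls /webroot/hls/myStream.m3u8

The iPhone can then play the resulting .m3u8 natively.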
According to the Chromecast developers page, Chromecast supports the SmoothStreaming container, which I believe uses video chunks with the .ismv extension. I am having problems getting those video files to play.
If I am not mistaken, Chrome/Chromecast's implementation of the video tag only supports .mp4 and .webm files, so using cast.MediaLoadRequest (in a Chrome sender app) would not work if you pass it the URL of a manifest file or an .ismv container.
It does seem possible to write code that stitches together MPEG-DASH chunks using the MediaSource API, given an MPEG-DASH manifest file. However, Chrome's implementation of the MediaSource spec does not appear to support .ismv chunks, and therefore offers no means to play Smooth Streaming video this way.
Assuming you parsed a manifest file to get the Smooth Streaming video chunks, how would it be possible for Chromecast to play .ismv H.264 containers, such as the ones that can be found here? Or does Chrome not support .ismv files? If so, what SmoothStreaming containers does Chrome/Chromecast support?
Chromecast supports MPEG-DASH and Smooth Streaming. See more details here:
https://developers.google.com/cast/supported_media_types
We'll provide some code snippets for Smooth Streaming soon. Stay tuned.
The default Receiver provided does not support SmoothStreaming (nor MPEG-DASH).
You'll need to code your own receiver to do so.
See https://stackoverflow.com/a/17978070/2665789 for a little more help.
Hopefully Google posts working samples of live streaming soon!
You can throw SmoothStreaming at some of the sample receivers provided by Google.
The cast-custom-receiver and the Cast-Media-Player-Library-Sample support SS with PlayReady encryption out of the box.
Well, you need to do some tricks, like changing the URL suffix from "ism/" to "ism/Manifest" (e.g. http://host/video.ism/ becomes http://host/video.ism/Manifest), and it just works. You'll need to do the same in the [cast-sender-tool-chrome], adding the file extension to the list of three inside the main HTML file.
I am trying to find a way to transcode an RTSP stream to HTTP (for iOS) so that I can view an RTSP stream on an iPad. The video is embedded in our SaaS web view, and launching a third-party player is not a possibility.
I found Sirannon, which according to the documentation can do this, no problem.
However, I am puzzled about how to actually execute it.
Our RTSP stream looks like this: rtsp://xxx.xxx.xxx.xxx:554/ch0_unicast_firststream
There is no .sdp file or anything, and VLC can play it fine.
But if I open a browser and attempt to open http://localhost:8080/RTSP-proxy/192.168.33.216/ch0_unicast_firststream
or
http://localhost:8080/RTSP-proxy/192.168.33.216:554/ch0_unicast_firststream
it gives me this error:
[1516250] Warning: core.HTTP-server: Handling RuntimeError: Could not guess container type for URL(/RTSP-proxy/192.168.33.216/ch0_unicast_firststream) (core.HTTP-server.session-42)
So far I haven't found any good examples using Sirannon. I am also open to using VLC, but again, I don't know if or how to do an RTSP-to-HTTP conversion with VLC.
With Wowza we are able to preemptively start the root stream ahead of the HTTP stream request.
With a delay of about 3 seconds, the client is able to connect nicely. Then we monitor the client list to see if it is empty; if so, we kill the stream.
I've been looking into this for personal use and hit the same error. After some digging (notably in src/Communicator/HTTP/HTTPSession.cpp), I've made some progress.
While it's not explicitly in the documentation, it seems that you have to add the container type to the app type in the proxy URL, just as if you were streaming a file, like so:
http://localhost:8080/RTSP-proxy#[CONTAINER_TYPE]/192.168.33.216:554/ch0_unicast_firststream
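As for the VLC route mentioned in the question, VLC's command line can also re-serve an RTSP source over plain HTTP; a rough sketch, where the transcode settings and port are assumptions (and note that iOS generally expects HLS rather than a raw HTTP stream):

    # Pull the RTSP source and serve it as an MPEG-TS stream over HTTP
    # on port 8081; -I dummy runs VLC without its GUI.
    vlc -I dummy rtsp://192.168.33.216:554/ch0_unicast_firststream \
        --sout '#transcode{vcodec=h264,acodec=mp4a}:std{access=http,mux=ts,dst=:8081/stream}'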
There is a web page with a Flash stream on it. I want to download this stream and forward it to another streaming server, and where possible replace the audio stream (e.g. with a translation), but without recompressing the video stream. The usual way to do this at the moment is to capture and rebroadcast the Flash player view from the web page, which is obviously suboptimal because the video has to be recompressed, making the quality notably worse and loading the CPU.
Does someone have an idea how to do this? VLC seems to be able to act as a relay, but it also seems not to support RTMP at all.
If you're ready to do this programmatically, you can use crtmpserver (C++) or Red5 (Java) with any RTMP client; otherwise this question doesn't belong on SO.
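If a concrete sketch helps: FFmpeg can do the relay itself, copying the video untouched and substituting the audio. The URLs and the replacement-audio source are placeholders:

    # Pull the source RTMP stream, keep the original video without
    # recompression (-c:v copy), take audio from a second input, and
    # push the result to the destination RTMP server.
    ffmpeg -i rtmp://source.example.com/app/stream \
        -i translated_audio.mp3 \
        -map 0:v -map 1:a \
        -c:v copy -c:a aac -b:a 128k \
        -f flv rtmp://dest.example.com/app/stream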