Does the YouTube RTMP server notify the user's app that the stream has stopped on the server's side? - youtube

In my app I create an RTMP stream to YouTube using the livestream RTMP URL and stream key, without using the YouTube API directly. After the stream starts, I receive from the server only timestamp-type chunks with no messages, and nothing more. The chunks keep arriving even if I stop the live stream in my YouTube channel's studio, and no chunk with a "stream is stopped" message ever appears. Does the YouTube RTMP server not monitor such cases when the RTMP livestream was created manually, without the YouTube API?
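For what it's worth, RTMP ingest endpoints often signal a stop simply by closing the TCP connection rather than by sending an explicit status message, so the most practical thing to watch for on the client is the socket going away or writes starting to fail. Below is a minimal Swift sketch, assuming you manage the TCP connection yourself with Network.framework (most RTMP libraries hide this socket, so treat it as an illustration of the idea rather than a drop-in fix); the host and port are YouTube's public RTMP ingest defaults.

```swift
import Foundation
import Network

// Sketch only: assumes you own the TCP connection that carries the RTMP
// session. Real RTMP libraries usually manage this socket internally.
let connection = NWConnection(host: "a.rtmp.youtube.com", port: 1935, using: .tcp)

connection.stateUpdateHandler = { state in
    switch state {
    case .ready:
        print("TCP connection to the ingest server is up")
    case .failed(let error):
        // This is typically the only "stream stopped" signal you get.
        print("Connection failed: \(error); treat the stream as stopped")
    case .cancelled:
        print("Connection cancelled locally")
    default:
        break
    }
}

// Keep a receive loop running: isComplete == true means the server
// closed its side of the connection.
func receiveLoop() {
    connection.receive(minimumIncompleteLength: 1, maximumLength: 4096) { data, _, isComplete, error in
        if isComplete || error != nil {
            print("Server closed the connection; the stream is over")
            return
        }
        // `data` holds raw RTMP chunks (acknowledgements, pings, ...) to parse here.
        receiveLoop()
    }
}

connection.start(queue: .main)
receiveLoop()
RunLoop.main.run()   // keep a command-line sketch alive for the async callbacks
```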

Related

WebRTC P2P stream a YouTube video

I started learning about WebRTC and I'm interested in whether the API could be used for peer-to-peer streaming of, for example, a YouTube video. I could not find any articles on this. Would it be possible to use the API to stream and synchronize a YouTube video to two people in real time?
No, you will not be able to use YouTube as a WebRTC peer. The media stream from YouTube will not be able to perform the STUN and DTLS exchanges or set up the required SRTP stream.
What you could do is write a custom application that acts as an intermediary between YouTube and the WebRTC peers. The custom application would need to pull the stream down from YouTube and then forward it to any WebRTC peers that connect to it.
You need an intermediary gateway to do that.
I have read on this page that you can convert a WebRTC stream to RTMP H.264+AAC for YouTube Live. They use Flashphoner.
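To make the intermediary idea above a bit more concrete, here is a hedged Swift sketch of the "forward it" half of such a gateway: it shells out to ffmpeg to remux a source you have already pulled down (and are allowed to restream) into an RTMP ingest. The ffmpeg path, the source, and the RTMP URL are placeholders, and the WebRTC leg toward the peers would still need a real media server (SFU) in front of them.

```swift
import Foundation

// Hedged sketch of the relay half of a gateway: hand a source off to ffmpeg
// and push it to an RTMP ingest without re-encoding. Paths and URLs are
// placeholders; adjust them to your setup.
let ffmpeg = Process()
ffmpeg.executableURL = URL(fileURLWithPath: "/usr/local/bin/ffmpeg")  // adjust to your install
ffmpeg.arguments = [
    "-re",                              // read the input at its native frame rate
    "-i", "input-source-url-or-file",   // placeholder source
    "-c", "copy",                       // no re-encode, just remux
    "-f", "flv",                        // RTMP carries FLV
    "rtmp://your-gateway.example.com/live/streamKey"  // placeholder ingest URL
]

do {
    try ffmpeg.run()
    ffmpeg.waitUntilExit()
    print("ffmpeg exited with status \(ffmpeg.terminationStatus)")
} catch {
    print("Could not launch ffmpeg: \(error)")
}
```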

What is the role of a streaming server like Wowza?

I have been exploring how to live stream from an iPhone. I have learned that I will have to publish a stream to a URL on a Wowza server. I will also require an iOS library to encode and compress the camera output and send that stream over the RTMP protocol to the Wowza server. At the receiving end, there should be a player which can decode and decompress the stream that comes from Wowza to a device like an iPhone (for a user who wants to watch the live stream).
My question is: if encoding is done through a particular iOS SDK, RTMP plays the role of a protocol, and the player at the receiving end does the decoding, then what is the role of Wowza? What is its function that makes it so important in the live streaming process?
I have been searching for the function of a media streaming server like Wowza for three days, but I could not understand exactly what it does.
I am desperate for an answer.
Any explanation will be appreciated, thanks in advance!
I actually did a year-long project involving media streaming on iOS and I used Wowza. The role of Wowza is to function as a media server that can receive the video broadcast from an iOS device over the RTMP protocol. With Wowza, you have the option to send HTTP parameters that instruct the server to begin or stop recording the live video that is being streamed, as sketched below. You also have the option of embedding video players in websites for live viewing.
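For example, the recording can be driven over HTTP roughly like this. The host, port, and query parameters below are assumptions modeled on Wowza's livestreamrecord HTTP provider and vary by Wowza version, so check the documentation for your installation before relying on them.

```swift
import Foundation

// Hedged sketch: the URL, port, and query parameters are assumptions
// (modeled on Wowza's livestreamrecord HTTP provider) and depend on how
// your Wowza server is configured. Adjust them to your installation.
let recordURL = URL(string:
    "http://wowza.example.com:8086/livestreamrecord?app=live&streamname=myStream&action=startRecording")!

let task = URLSession.shared.dataTask(with: recordURL) { _, response, error in
    if let error = error {
        print("Request failed: \(error)")
    } else if let http = response as? HTTPURLResponse {
        print("Wowza answered with HTTP \(http.statusCode)")
    }
}
task.resume()
RunLoop.main.run()   // keep a command-line sketch alive for the callback
```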

Wowza Android mobile stream doesn't work

I configured the Wowza streaming server for VOD and live streaming. When I play the RTSP URL in VLC on my machine, it works fine. But when I try the same VOD RTSP URL on my Android mobile phone, it doesn't work. I also tried streaming the same URL in VLC on a different machine and it doesn't work there either. Why does this happen? How can I configure the Wowza server for RTSP streaming to an Android phone?

CMSampleBufferRef to Audio Video for RTMP streaming

I am using iOS libRTMP to connect to an RTMP server. Now my problem is: how can I send the audio and video to the RTMP server? I've seen that CMSampleBufferRef is used for processing and is sent to the server, with the data wrapped in some FLV-style metadata.
I tried VideoCore, but it fails with the server I am using (Evostream) while working with other servers. Medialibs works, but I need an open source option.
How do we convert CMSampleBufferRef to a format that RTMP understands?
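As a hedged starting point: after your H.264 encoder hands you a compressed CMSampleBuffer, you can pull out the AVCC-formatted (length-prefixed) payload and the presentation timestamp as sketched below. Wrapping them into FLV video tags and writing them with libRTMP is the remaining step; sendVideoTag is only a placeholder for that code. You also need to send the SPS/PPS parameter sets (available from the sample buffer's format description) once, before the first frame, as the AVC sequence header.

```swift
import CoreMedia
import Foundation

// Hedged sketch: extract the AVCC-formatted H.264 payload from a compressed
// CMSampleBuffer. Building the FLV video tag and handing it to libRTMP is
// left to the placeholder sendVideoTag below.
func handleEncodedFrame(_ sampleBuffer: CMSampleBuffer) {
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return }

    var totalLength = 0
    var dataPointer: UnsafeMutablePointer<CChar>? = nil
    let status = CMBlockBufferGetDataPointer(blockBuffer,
                                             atOffset: 0,
                                             lengthAtOffsetOut: nil,
                                             totalLengthOut: &totalLength,
                                             dataPointerOut: &dataPointer)
    guard status == kCMBlockBufferNoErr, let pointer = dataPointer else { return }

    // The buffer holds one or more NAL units, each prefixed with a 4-byte
    // big-endian length (AVCC layout), which is what FLV/RTMP expects.
    let payload = Data(bytes: pointer, count: totalLength)
    let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

    sendVideoTag(payload, at: timestamp)   // placeholder: your FLV muxing + libRTMP write
}

func sendVideoTag(_ data: Data, at time: CMTime) {
    // Placeholder so the sketch compiles; real code would build an FLV VideoData
    // tag (frame type, codec id, composition time) and write it with libRTMP.
    print("would send \(data.count) bytes at \(CMTimeGetSeconds(time))s")
}
```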

YouTube Live Streaming API does not stream the video

I looked into the YouTube Live Streaming API, and it provides the facility to stream video to YouTube.
But how can we access it during development?
How can we use this API (YouTube Live Streaming) to stream video on YouTube?
You will have to enable the "Live" streaming feature in your account. It looks like you have to be eligible to have it enabled.
http://www.youtube.com/live/all
If it's enabled, you can push your pre-recorded video files or live video to Google's publishing point. For that you can use any supported media encoder. More info on how to set it up can be read here:
https://support.google.com/youtube/answer/2907883?hl=en&ref_topic=2853713
Once that's set up, you can stream events as if they were real live events!
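If you want to drive this from code rather than from the dashboard, a hedged Swift sketch of the first step with the YouTube Data API v3 is shown below: creating a liveStream resource to obtain the RTMP ingestion address and stream name for your encoder. ACCESS_TOKEN stands in for an OAuth 2.0 token with the YouTube scope, and the field names follow the liveStreams resource but should be double-checked against the current API reference; a liveBroadcast also has to be created and bound to this stream.

```swift
import Foundation

// Hedged sketch: create a liveStream resource with the YouTube Data API v3.
// ACCESS_TOKEN is a placeholder for an OAuth 2.0 token with the
// https://www.googleapis.com/auth/youtube scope.
let accessToken = "ACCESS_TOKEN"
var request = URLRequest(url: URL(string:
    "https://www.googleapis.com/youtube/v3/liveStreams?part=snippet,cdn")!)
request.httpMethod = "POST"
request.addValue("Bearer \(accessToken)", forHTTPHeaderField: "Authorization")
request.addValue("application/json", forHTTPHeaderField: "Content-Type")

let body: [String: Any] = [
    "snippet": ["title": "My app stream"],
    "cdn": ["ingestionType": "rtmp", "resolution": "720p", "frameRate": "30fps"]
]
request.httpBody = try? JSONSerialization.data(withJSONObject: body)

URLSession.shared.dataTask(with: request) { data, _, error in
    guard let data = data, error == nil else { return }
    // The response's cdn.ingestionInfo carries ingestionAddress and streamName,
    // which together form the RTMP URL your encoder should publish to.
    print(String(data: data, encoding: .utf8) ?? "no body")
}.resume()

RunLoop.main.run()   // keep a command-line sketch alive for the async callback
```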
