RTSP link not found in source - youtube

I am fixing an old RTSP YouTube player made for Symbian devices, which is also used on BlackBerry and Nokia 3GP mobile phones.
A couple of weeks ago it was working and I was able to stream and watch videos. Yesterday I got back to the code to continue my tasks, but when I put a breakpoint on the code that extracts the RTSP link, the link can no longer be found in the response source.
I wrote to YouTube to ask whether they have stopped supporting this, but I got no answer, even though I am a paying YouTube developer and have a paid Gmail account; I suspect the COVID situation has everyone waiting.
If anyone has experience with these RTSP 3GP players and knows something, please comment on whether this is deprecated and I should stop trying to fix this player and look for another task, or whether it still works and I just need some other information or source to make it work again.
When it was working, the response contained an RTSP link like this:
rtsp://r2---sn-q4flrn7r.googlevideo.com:554/Cj0LENy73wIaNAlQN5SaCYfUwBMYESARFC2LT9ZeMOCoAUIASARgyffQ77Xxk-teigELemo1OFdoSXNsSlkM/78FAEBAEB3541F6B4C128E9C594EFCF7BF50A2F9.38FCBF08BF6550B599C4AB4BE2FD62C2837C7802/yt8/1/video.3gp
User agent (I tried different agents, but with no success):
NokiaC5-00/061.005 (SymbianOS/9.3; U; Series60/3.2 Mozilla/5.0; Profile/MIDP-2.1 Configuration/CLDC-1.1) AppleWebKit/525 (KHTML, like Gecko) Version/3.0 Safari/525 3gpp-gba

Unfortunately, Google dropped RTSP support when API v2.0 was discontinued.

Related

Video Streaming and Broadcasting using WebRTC

I am very new to real-time protocols and have some questions about how WebRTC works and how I can implement it. I am trying to create a one-to-many livestream like Facebook or Periscope, where one user broadcasts and other users join and watch the stream. I am using Swift on the client side.
My questions are:
1. How do I broadcast a video using WebRTC?
2. Is there an SDK for WebRTC in Swift/iOS?
I know the questions are very vague, but some guidance in the right direction would be great because I am not sure where to start.
You will need backend servers for that.
If you plan on broadcasting to multiple users directly from your mobile app, then stop: you need to connect your mobile app to a backend media server, which can then broadcast the video to a larger audience.
There are several commercial and open source alternatives that enable you to do that. I'd look at Red5 Pro, Wowza, SwitchRTC, Jitsi, Janus and Kurento for this task.
For the client side, look at react-native-webrtc.
You can find more tools for WebRTC developers here.
Regarding your question (2), there is also an SDK for iOS here and a neat getting-started page here (it is about 2.5 years old, but I haven't found anything better so far).
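To make (2) a bit more concrete, here is a minimal sketch of the publisher side using the GoogleWebRTC CocoaPod (the pod choice, STUN server, and track/stream IDs are my assumptions, not from the original answer). Signaling, i.e. exchanging the SDP offer/answer and ICE candidates with your media server, and camera capture via RTCCameraVideoCapturer are left out.

```swift
import WebRTC

// Minimal publisher-side setup with the GoogleWebRTC pod (assumed dependency).
// Signaling and camera capture are intentionally omitted.
final class Broadcaster {
    private let factory = RTCPeerConnectionFactory()
    private var peerConnection: RTCPeerConnection?

    func start() {
        let config = RTCConfiguration()
        // Public STUN server, used here only as an example.
        config.iceServers = [RTCIceServer(urlStrings: ["stun:stun.l.google.com:19302"])]

        let constraints = RTCMediaConstraints(mandatoryConstraints: nil,
                                              optionalConstraints: nil)
        // A real app would pass a delegate here to receive ICE candidates and state changes.
        peerConnection = factory.peerConnection(with: config,
                                                constraints: constraints,
                                                delegate: nil)

        // Attach local audio/video tracks (track and stream IDs are arbitrary).
        let audioTrack = factory.audioTrack(withTrackId: "audio0")
        let videoSource = factory.videoSource()
        let videoTrack = factory.videoTrack(with: videoSource, trackId: "video0")
        peerConnection?.add(audioTrack, streamIds: ["stream0"])
        peerConnection?.add(videoTrack, streamIds: ["stream0"])

        // Create the SDP offer and hand it to your signaling channel.
        peerConnection?.offer(for: constraints) { sdp, error in
            guard let sdp = sdp, error == nil else { return }
            self.peerConnection?.setLocalDescription(sdp) { _ in
                // Send sdp.sdp to the media server over your own signaling channel.
            }
        }
    }
}
```

The media server (SFU) then takes this single published stream and fans it out to the viewers, which is why broadcasting directly from the phone to many peers is discouraged above.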

Re-stream a live video from iPhone to a server

I am getting a video feed in my app from a drone. The drone's SDK delivers the video to my app as Data or NSData. I want to stream or divert the same feed to a server (for example a Wowza server). Both things should happen simultaneously.
You can use the ffmpeg library for restreaming it.
There are ffmpeg samples for Swift and Objective-C.
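As a rough illustration of the idea (not the original answer's sample): if the incoming feed can be exposed as a local source, an ffmpeg wrapper such as ffmpeg-kit can push it on to an RTMP ingest point. The pod, the UDP input URL and the Wowza RTMP URL below are all assumptions/placeholders.

```swift
import ffmpegkit

// Sketch only: re-publish a local feed to an RTMP ingest point with ffmpeg-kit
// (assumed dependency, e.g. the ffmpeg-kit-ios-full pod).
// "udp://127.0.0.1:5000" stands in for wherever the drone feed is exposed,
// and the rtmp:// URL is a placeholder for your own Wowza application/stream.
let command = "-i udp://127.0.0.1:5000 -c:v copy -c:a aac -f flv rtmp://my-wowza-host:1935/live/droneStream"

FFmpegKit.executeAsync(command) { session in
    // Inspect the result once ffmpeg finishes (or the stream is interrupted).
    print("ffmpeg finished with return code: \(String(describing: session?.getReturnCode()))")
}
```

Note that ffmpeg needs a readable input (file, pipe, or network source); raw NSData buffers from the SDK would first have to be written to such a source, which is the hard part the next answer runs into.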
I've been trying to send the video to a streaming service like YouTube for a year. The video coming into the iPad from the controller (connected to a Phantom 3 Advanced) is in H.263 format. ffmpeg is best at bulk transcoding, not a streaming environment. I tried https://github.com/LaiFengiOS/LFLiveKit but it has bugs. It's a wrapper on ffmpeg that knows how to do RTMP.
The DJI GO app knows how to do this; I've asked in the forum for help or a code sample but they won't help. The bottom line is I can find no way to stream the video from the drone to a streaming service like YouTube or Wowza. I wish Wowza could accept H.263 natively, but their site lists only H.264.
So I can't give you an answer, but I can give you what I've figured out over the last year.

Live Streaming App iOS

I am trying to develop a live-streaming application like the Meerkat app, where user A can broadcast a live stream while other users are able to watch it. I am having trouble understanding the architecture and mechanisms used to upload video to a server. Currently, I am using a dedicated server with FFmpeg installed on it. I also know FFserver can be used to perform RTSP communication, but I am still unclear how to do this. Can anyone guide me on this?
I would like to know how to upload videos to a server or whether there is another way to perform a live stream. Open source frameworks are welcome.
For live streaming video/audio, http://www.wowza.com/ gives you the best functionality. You have to set up your server in Wowza; you can't test it otherwise.
For iOS, you can broadcast and receive using the demo below, which you can download from here.
I think it will be helpful to you :)
Well, I was looking for something open source that could be used without any additional cost. Luckily I found the Red5 server (open source): https://github.com/Red5/red5-server
I configured it on my dedicated server and it is running perfectly fine. Now that the server-side issue is solved, I needed something for the iOS side. For that I found https://github.com/slavavdovichenko/MediaLibDemos3x
So with the combination of these two repos I was able to make a live-streaming app like Meerkat.
Thanks

Can Weborb be used to do live video streaming from an iPhone through a media server?

I am new to multimedia and iOS programming, and while Googling I came across WebORB, which provides an RTMP library for iOS. It doesn't clearly say whether it can be used to stream live video through a media server like Red5.
If anyone has used this, please let me know whether it can be used to stream live video from an iPhone to a media server, and where it fits in the whole setup.
Does it act as a server itself between a media server and the iPhone application, or does it have its own media server?
I would also like some links to tutorials that can help me start the real coding pertaining to RTMP streaming to a media server.
Thanks.
The short answer is yes, the RTMP library for iOS can be used with Red5, FMS, WebORB, etc. The library is not the server itself but a client: it establishes the RTMP connection to the server and encodes the stream before sending it.
As I remember, the library package contains some examples that demonstrate how streaming works. Unfortunately, the official site doesn't show any examples related to streaming, but the available examples can be useful to start working with the library (http://www.themidnightcoders.com/products/weborb-for-mobile/ios-integration/rtmp-ios-examples-integration-between-java-net-and-ios.html). The documentation looks up to date: http://www.themidnightcoders.com/fileadmin/docs/ios/.

How to stream live audio from iOS device to Wowza media server?

I'm trying to find out how I can use Wowza Media Server (which I just installed) to receive a live audio stream from an iOS device (Xcode simulator) and play it in a web browser (Safari) using HTTP Live Streaming. I just need direction, guidance or a tutorial to start with, or just the basic concept of how it works.
Sorry for the newbie question, but I really did try digging through the documentation; there is nothing about iOS HTTP Live Streaming specifically, and it concentrates more on Flash streaming (FLV).
Thanks.
Use Wowza's MPEG-TS capabilities
There are a couple of Google hits and topics on the Wowza forums. This one looks fine for getting you started: Using an MPEG Transport Stream (MPEG-TS) encoder with Wowza Pro (MPEG-TS).
You can then just click a simple link in Safari on iOS devices and it should work fine. The link will look something like:
http://myserver.com/appName/myStream/playlist.m3u8
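If you later want to play the same HLS playlist inside a native app rather than Safari, a minimal AVFoundation sketch looks like this (the URL is the placeholder from above, not a real server):

```swift
import UIKit
import AVFoundation
import AVKit

// Play an HLS playlist served by Wowza inside an app instead of Safari.
// AVPlayer handles HTTP Live Streaming natively.
func playStream(from viewController: UIViewController) {
    guard let url = URL(string: "http://myserver.com/appName/myStream/playlist.m3u8") else { return }

    let player = AVPlayer(url: url)
    let playerViewController = AVPlayerViewController()
    playerViewController.player = player

    viewController.present(playerViewController, animated: true) {
        player.play()   // start playback once the player UI is on screen
    }
}
```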
Wowza recently released an add-on called "GoCoder". You can use it to encode your live stream on supported iOS devices and broadcast it to any screen.
You can download the encoder here:
https://itunes.apple.com/us/app/wowza-gocoder/id640338185?ls=1&mt=8
Here are the steps to configure the GoCoder add-on with a live application:
http://www.wowza.com/forums/content.php?500
