I'm trying to build an application that allows users to livestream from their webcams.
My main problem is that I don't know how to connect the webcam script to the Red5 streaming server, or how to embed the stream in a website.
I will be thankful for any guidance.
You need to publish the stream to a Red5 RTMP channel, and then connect your Flash script to the same channel via RTMP to receive the stream. You also need an application on the Red5 side that handles RTMP streaming; try the oflaDemo application, which is included in the default build, and study how it works.
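To make the channel idea concrete, here is a small sketch of the two RTMP addresses involved. The host and stream names are placeholders, and since the original setup is Flash-based, this only illustrates the addressing convention that oflaDemo follows:

```swift
// A Red5 "channel" is addressed by application name plus stream name;
// the publisher and every player must agree on both.
let red5Host = "your-red5-host"   // placeholder
let application = "oflaDemo"      // the demo app shipped in the default build
let streamName = "userWebcam"     // arbitrary, but identical on both sides

// The webcam script publishes into the application...
let publishURL = "rtmp://\(red5Host)/\(application)"   // ...then publishes "userWebcam"
// ...and the player embedded in the page subscribes to the same stream.
let playbackURL = "rtmp://\(red5Host)/\(application)/\(streamName)"
print(publishURL, playbackURL)
```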
Streaming is not that hard, though; here are some guides on how to do it:
Simple guide with oflaDemo
Guide 2
I hope this helps.
I am getting a video feed in my app from a drone. The drone's SDK delivers the video into my app as Data or NSData. I want to stream (or divert) that feed to a server (for example a Wowza server). Both things should happen simultaneously.
You can use the ffmpeg library to restream it.
There are sample ffmpeg projects in Swift and Objective-C.
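As a rough sketch of the restreaming idea: feed the incoming Data chunks into a named pipe and let ffmpeg read from it and push to the Wowza RTMP endpoint. The ffmpeg-kit wrapper, the H.264 input assumption, and all host/app/stream names below are my assumptions, not something specified in this answer; check them against your SDK and server.

```swift
import Foundation
import ffmpegkit  // ffmpeg-kit wrapper (assumption; verify API names for your version)

// Create a FIFO that ffmpeg will read the raw drone feed from.
let fifoPath = NSTemporaryDirectory() + "drone.feed"
mkfifo(fifoPath, 0o644)

// Restream: demux the raw elementary stream (the input format depends on
// what the SDK hands you) and publish it over RTMP without re-encoding.
let command = "-f h264 -i \(fifoPath) -c:v copy -an -f flv " +
              "rtmp://your-wowza-host:1935/live/droneStream"
FFmpegKit.executeAsync(command) { session in
    print("ffmpeg finished with state: \(String(describing: session?.getState()))")
}

// Note: opening the FIFO for writing blocks until ffmpeg opens it for reading.
let handle = FileHandle(forWritingAtPath: fifoPath)

// Append each chunk the drone SDK delivers.
func onVideoData(_ data: Data) {
    handle?.write(data)
}
```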
I've been trying to send the video to a streaming service like YouTube for a year. The video coming into the iPad from the controller (connected to a Phantom 3 Advanced) is in H.263 format. ffmpeg is best at bulk transcoding, not at live streaming environments. I tried https://github.com/LaiFengiOS/LFLiveKit, a wrapper around ffmpeg that knows how to do RTMP, but it has bugs.
The DJI GO app knows how to do this; I've asked in the forum for help or a code sample, but they won't help. The bottom line is that I can find no way to stream the video from the drone to a streaming service like YouTube or Wowza. I wish Wowza could accept H.263 natively, but their site lists only H.264.
So I can't give you an answer, but I can give you what I've figured out over the last year.
I have been exploring how to live stream from an iPhone. From what I've learned, I will have to publish a stream to a URL on a Wowza server. I will also need an iOS library to encode and compress the camera output, and I will have to send that stream over the RTMP protocol to the Wowza server. At the receiving end, there should be a player that can decode and decompress the stream coming from Wowza on a device like an iPhone (for a user who wants to watch the live stream).
My question is: if encoding is done by a particular iOS SDK, RTMP serves as the protocol, and the player at the receiving end handles decoding, then what is the role of Wowza? What function makes it so important in the live streaming process?
I have been searching for the function of a media streaming server for three days, but I could not understand exactly what a media streaming server like Wowza does.
I am desperate for an answer; any explanation will be appreciated. Thanks in advance!
I actually did a year-long project involving media streaming on iOS, and I used Wowza. The role of Wowza is to act as a media server that receives the video broadcast from an iOS device over the RTMP protocol. With Wowza, you can send HTTP parameters that instruct the server to begin or stop recording the live video being streamed. You also have the option of embedding video players in websites for live viewing.
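For example, the recording control is just an HTTP call to the server. A minimal sketch, assuming the classic HTTPLiveStreamRecord module on its default port; the host, application, and stream names are placeholders, and the exact module and URL shape vary by Wowza version:

```swift
import Foundation

// Ask Wowza to start recording the live stream "myStream" in the "live" app.
let url = URL(string: "http://your-wowza-host:8086/livestreamrecord"
                    + "?app=live&streamname=myStream&action=startRecording")!

let task = URLSession.shared.dataTask(with: url) { _, response, error in
    if let error = error {
        print("record request failed: \(error)")
    } else if let http = response as? HTTPURLResponse {
        print("record request returned \(http.statusCode)")
    }
}
task.resume()
```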
The problem is simple. In Wowza, when I use a stream file in live mode (so the stream is published only when requested over RTMP), I'm unable to publish the stream and get video when I use JW Player on iOS (i.e., through Apple HLS). So I'm looking for a way to stream the live cameras to iOS on demand!
I've just started with Wowza, so for the moment I can't write Java code to extend its functionality; please be patient.
I've seen this post on Stack Overflow, but I can't see the solution: Wowza: Need to stream rtp-live to iphone!
I'm developing a website that needs an external RTMP stream.
I'm using JW Player to play the stream using Flash (examples and information here).
My problem is that the stream does not work on iOS.
Can somebody suggest a solution?
iOS does not support the RTMP protocol; for that, you have to use HTTP-based protocols instead, i.e., HTTP Live Streaming (HLS).
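Concretely, on iOS you point AVPlayer (or a web view) at the HLS playlist the server exposes instead of the rtmp:// URL. A minimal sketch; the playlist URL below follows Wowza's usual http://host:1935/app/stream/playlist.m3u8 shape, and the names are placeholders:

```swift
import AVFoundation
import AVKit
import UIKit

// iOS plays HLS (.m3u8) natively, so hand AVPlayer the playlist URL.
func playLiveStream(from presenter: UIViewController) {
    let hlsURL = URL(string: "http://your-server:1935/live/myStream/playlist.m3u8")!
    let controller = AVPlayerViewController()
    controller.player = AVPlayer(url: hlsURL)
    presenter.present(controller, animated: true) {
        controller.player?.play()
    }
}
```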
I am new to multimedia and iOS programming, and while Googling I came across WebORB, which provides an RTMP library for iOS. It doesn't clearly state whether it can be used to stream live video through a media server like Red5.
If anyone has used it, please let me know whether it can be used to stream live video from an iPhone to a media server, and where it fits in the overall setup.
Does it act as a server itself, sitting between a media server and the iPhone application, or does it come with its own media server?
I would also like some links to tutorials that can help me start the actual coding for RTMP streaming to a media server.
Thanks.
The short answer is yes, the RTMP library for iOS can be used with Red5, FMS, WebORB, etc. The library is not a server itself but a client: it establishes the RTMP connection to the server and encodes the stream before sending it.
As I remember, the library distribution contains some examples that demonstrate how streaming works. Unfortunately, the official site doesn't show any streaming-related examples, but the available examples can be useful for getting started with the library (http://www.themidnightcoders.com/products/weborb-for-mobile/ios-integration/rtmp-ios-examples-integration-between-java-net-and-ios.html). The documentation looks up to date: http://www.themidnightcoders.com/fileadmin/docs/ios/.
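WebORB's own API specifics are in the docs linked above, but as a concrete illustration of how such a client library is driven, here is the same publish flow using the open-source HaishinKit library as a stand-in (my substitution, not part of the original answer; the API shape follows HaishinKit 1.x, and the host and stream names are placeholders):

```swift
import AVFoundation
import HaishinKit

// The library is the RTMP *client*: it captures, encodes, connects out to
// the media server (Red5, FMS, Wowza, ...), and publishes a named stream.
let connection = RTMPConnection()
let stream = RTMPStream(connection: connection)

// Attach capture devices; the library handles the encoding.
stream.attachAudio(AVCaptureDevice.default(for: .audio))
stream.attachCamera(AVCaptureDevice.default(.builtInWideAngleCamera,
                                            for: .video, position: .back))

// Connect to the server's application, then publish under the stream name
// that players will use to subscribe.
connection.connect("rtmp://your-media-server/live")
stream.publish("myStream")
```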