Can we do RTMP streaming in an iOS app?
I need to stream across devices using RTMP without noticeable delay or latency. I have tried HLS, but it has too much latency.
Please suggest a solution; I am ready to buy code or a library if it matches my scenario.
Technically, yes. But Apple will reject the app from the App Store if the video content runs longer than 10 minutes and is delivered over cellular data; in that case Apple requires HTTP Live Streaming (see QA1767):
https://developer.apple.com/library/ios/qa/qa1767/_index.html
Update:
It appears that Apple now allows some apps to break this rule, but YMMV.
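On the latency point from the question: HLS delay is inherent to its segmented design, since the player typically buffers several whole segments before starting playback. A rough back-of-envelope sketch (segment duration and buffer depth below are illustrative assumptions, not fixed values):

```python
def hls_latency_floor(segment_duration_s: float, buffered_segments: int) -> float:
    """Rough lower bound on HLS glass-to-glass latency: the player
    waits for `buffered_segments` full segments before playing."""
    return segment_duration_s * buffered_segments

# Classic HLS defaults: ~6 s segments, 3 segments buffered.
print(hls_latency_floor(6.0, 3))   # -> 18.0 seconds
# RTMP, by contrast, delivers a continuous stream and commonly
# achieves a few seconds or less end-to-end.
```

This is why HLS feels much slower than RTMP even on a fast network: shorter segments reduce the floor but increase request overhead.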
Related
I am developing a video player app on iOS, and I am now thinking about how to support DLNA so that my app can mirror its online video to a TV through a DLNA-capable device.
Note that the online video plays in my app over Wi-Fi or a cellular network; I want to be able to switch playback to the TV, at which point my app becomes a remote control and a media server for the TV.
Which framework should I use?
I already know about Cyberlink and PlatinumKit.
I have worked on DLNA with iOS and Android devices. I did not use Cyberlink or PlatinumKit; I learned how the protocol works and wrote my own implementation in Swift and Java.
Here is my blog post about the subject. If you only need the DLNA essentials (discovering DLNA devices, streaming video to a device, and then controlling events like play, pause, and seek), you can find all the material you need here:
https://eliyar.biz/DLNA_with_iOS_Android/
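To give a concrete flavor of the first step, device discovery: DLNA devices are found via SSDP, an HTTP-over-UDP multicast protocol. A rough Python sketch of discovering MediaRenderer devices follows (on iOS you would do the equivalent with a UDP socket in Swift; the search target and timeout are illustrative choices):

```python
import socket

# SSDP discovery: DLNA devices answer an HTTP-over-UDP M-SEARCH sent to
# the well-known multicast group 239.255.255.250:1900.
SSDP_ADDR, SSDP_PORT = "239.255.255.250", 1900

def build_msearch(search_target: str = "urn:schemas-upnp-org:device:MediaRenderer:1",
                  mx: int = 3) -> bytes:
    """Build an SSDP M-SEARCH request; MX is the max response delay in seconds."""
    lines = [
        "M-SEARCH * HTTP/1.1",
        f"HOST: {SSDP_ADDR}:{SSDP_PORT}",
        'MAN: "ssdp:discover"',
        f"MX: {mx}",
        f"ST: {search_target}",
        "", "",                       # request ends with a blank line (CRLF CRLF)
    ]
    return "\r\n".join(lines).encode("ascii")

def discover(timeout: float = 3.0) -> list:
    """Send the M-SEARCH and collect the LOCATION headers of responders;
    each LOCATION points at the device's UPnP description XML."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(build_msearch(), (SSDP_ADDR, SSDP_PORT))
    locations = []
    try:
        while True:
            data, _ = sock.recvfrom(65507)
            for line in data.decode("ascii", "replace").split("\r\n"):
                if line.upper().startswith("LOCATION:"):
                    locations.append(line.split(":", 1)[1].strip())
    except socket.timeout:
        pass
    return locations
```

From each LOCATION URL you then fetch the device description, find the AVTransport service, and send SOAP actions (SetAVTransportURI, Play, Pause, Seek) for playback control, as described in the blog post above.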
I am developing a social app for Android and iOS; the iOS and server work has started.
Our app needs to broadcast live audio/video to end users.
We have tried setting up servers using Red5 and Wowza.
On iOS we got crashes and buggy behavior from the Red5 iOS broadcaster SDK, so we moved to the trial version of Wowza.
After implementing the Wowza GoCoder SDK for iOS, we found its license far too costly for us: $8000 plus $2000/year maintenance :(
Midnight Coder also seems buggy, judging from the reviews (I have not used its broadcaster client yet).
Can anybody recommend a good iOS SDK, or some custom way to implement live broadcast streaming from the mobile camera?
Any help will be highly appreciated.
Thanks
You can use the NGINX RTMP module as the live streaming server: https://github.com/arut/nginx-rtmp-module
For broadcasting from iOS, you can use LFLiveKit: https://github.com/LaiFengiOS/LFLiveKit
Both are free, open-source libraries.
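A minimal nginx.conf sketch for the RTMP module (the application name `live` is an arbitrary choice; adjust to your setup):

```nginx
rtmp {
    server {
        listen 1935;            # default RTMP port
        chunk_size 4096;

        application live {      # publish to rtmp://<host>/live/<stream-key>
            live on;
            record off;         # don't write incoming streams to disk
        }
    }
}
```

LFLiveKit on the phone would then publish to rtmp://your-server/live/your-stream-key, and viewers can play that same URL (or you can add an HLS output for playback-only clients, at the cost of latency).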
I am trying to build an audio/video streaming app that works cross-platform on iOS and Android mobile devices.
No matter how deeply I Google, I end up with suggestions pointing me towards the OpenTok/TokBox API, which is exactly what I want to avoid.
I've checked a few demos, but WebRTC/HTML5 do not seem able to stream video/audio in iOS browsers. For example, the https://apprtc.appspot.com demo does not work in Safari or Opera Mini on iOS.
When I try http://dev.opera.com/articles/media-capture-in-mobile-browsers/demo/ I can capture an image from my browser using the default iOS camera picker, but streaming video fails.
It seems that getUserMedia() is not supported by any browser on iOS.
Moreover, I am planning to run this in a WebView in a native iOS app, which seems an even further stretch.
I wish someone could point me towards something that helps me build a video streaming app (hopefully using HTML5) that works uniformly on iOS and Android (without TokBox).
You might want to look into Ericsson's Bowser app: http://www.ericsson.com/research-blog/context-aware-communication/bowser-openwebrtc-released-open-source. It claims to provide WebRTC on Android and iOS. Apparently the app is currently under review in the App Store, so if you wait, it may just be a case of downloading it. However, it is also open source, so if you can't wait, you can build it yourself: https://github.com/ericssonresearch/bowser.
The getUserMedia and WebRTC peer-to-peer connection APIs are not supported on iOS.
One of the reasons is that, at the moment, WebRTC efforts focus on the VP8 video codec, which Apple and Microsoft do not support natively. Support in the near future is unlikely, with Microsoft pushing for its own standard.
Doing what you want on iOS requires a native iOS-compatible solution, such as OpenCV, which supports video capture. You can find tutorials on Google for implementing an OpenCV-based solution.
Good news: this will be supported in Safari 11.0:
https://developer.apple.com/library/content/releasenotes/General/WhatsNewInSafari/Safari_11_0/Safari_11_0.html
I have tried to live stream audio (AAC-LC) from iOS for 3 months without much success...
I tried Audio Queues, which work well, but there is a strange delay (~4 s) and I don't know why (is it the high-level API?).
I tried Audio Units; using modified code from this source, it sometimes works in the simulator but never on the phone.
I am really lost; can anyone help me?
EDIT
I have to build a live streaming application (iPhone -> Wowza server via RTSP). The video part works well, with little delay (~1 s). Now I'm trying to add audio alongside the video, but I'm stuck with the SDK.
tl;dr: I need to capture microphone input and then send AAC frames over the network without a huge delay.
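For context, one common pitfall when sending raw AAC over a network is framing: each AAC frame is typically wrapped in a 7-byte ADTS header so the receiver can resynchronize and decode. A minimal sketch of building that header, shown here in Python for clarity rather than in the iOS capture code itself (the field values below assume 44.1 kHz stereo AAC-LC):

```python
def adts_header(frame_len: int,
                profile: int = 2,            # AAC-LC audio object type
                sample_rate_index: int = 4,  # index 4 -> 44100 Hz
                channels: int = 2) -> bytes:
    """Build a 7-byte ADTS header (no CRC) for one raw AAC frame.

    frame_len is the size of the raw AAC payload in bytes; the ADTS
    13-bit frame-length field counts the header too, hence the +7.
    """
    total = frame_len + 7
    return bytes([
        0xFF,                                            # syncword (high 8 bits)
        0xF1,                                            # syncword low, MPEG-4, no CRC
        ((profile - 1) << 6) | (sample_rate_index << 2) | (channels >> 2),
        ((channels & 0x3) << 6) | ((total >> 11) & 0x3),
        (total >> 3) & 0xFF,
        ((total & 0x7) << 5) | 0x1F,                     # buffer fullness = 0x7FF (VBR)
        0xFC,                                            # fullness low bits, 1 AAC frame
    ])

hdr = adts_header(371)               # e.g. a 371-byte AAC-LC frame
print(hdr.hex())                     # -> fff150802f5ffc
```

On iOS the raw frames would come from an encoder such as Audio Converter Services; prepending a header like this to each frame is what lets a server or player pick up the stream mid-flow.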
This app, which I just now completed, broadcasts audio between any two iOS devices on the same network:
https://drive.google.com/open?id=1tKgVl0X92SYvgpvbljRzilXNQ6iBcjqM
Compile it with the latest beta release of Xcode 9, and run it on two iOS 11 (beta) devices.
The app is simple; you launch it, and then start talking. Everything is automatic, from network connectivity to audio streaming.
Events generated by the app are displayed in an event log within the app.
Even though the code is simple and concise, the event log is provided to make the app's architecture quicker and easier to understand.
I am new to creating Nokia applications for Series 40 mobiles.
I am trying to use YouTube channel videos in my application, but I am getting an error in the simulator:
"The RTSP streaming feature is currently not supported in the Web Apps Simulator. Please test the streaming feature on one of the supported devices."
I tested on supported devices using the Nokia deploy method, but it is still not working, so can anyone please help me solve the problem?
I believe live video streaming is not possible on Nokia S40 series phones, maybe due to some technical limitations (or, say, design limitations).
That said, when you open a web link that streams video, on some devices it sometimes opens in the media player and starts streaming. But due to the low-quality radio hardware used for internet in the S40 series (these are obviously entry-level phones), internet speed is low, and it can seem as though video buffering or streaming is not possible.
It may also be that the simulator you are using does not support live video streaming. If your internet connection is proper (and speedy as well), then that is the root cause.