I am having issues with downloading and playing an HLS URL at the same time.
I have followed the WWDC '16 video and also the link below.
But I can either play the audio or download it; I am not able to do both at the same time.
The download also works only on iOS 11 beta releases, not on iOS 10 devices.
Am I missing something, or is there some other way of achieving this?
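For reference, here is a minimal sketch of the AVAssetDownloadURLSession approach from that WWDC session; the key point is to create the AVPlayerItem from the same AVURLAsset that the download task owns, which is what lets download and playback run concurrently (the session identifier, asset title, and stream URL are placeholders):

```swift
import AVFoundation

final class HLSDownloader: NSObject, AVAssetDownloadDelegate {
    private var session: AVAssetDownloadURLSession!
    private(set) var player: AVPlayer?

    func downloadAndPlay(url: URL) {
        // A background configuration is required for AVAssetDownloadURLSession.
        let config = URLSessionConfiguration.background(withIdentifier: "hls-download") // placeholder identifier
        session = AVAssetDownloadURLSession(configuration: config,
                                            assetDownloadDelegate: self,
                                            delegateQueue: .main)
        let asset = AVURLAsset(url: url)
        guard let task = session.makeAssetDownloadTask(asset: asset,
                                                       assetTitle: "My Stream", // placeholder title
                                                       assetArtworkData: nil,
                                                       options: nil) else { return }
        task.resume()

        // Play from the *same* asset the download task owns; AVFoundation then
        // serves playback from the partially downloaded data while it keeps downloading.
        player = AVPlayer(playerItem: AVPlayerItem(asset: task.urlAsset))
        player?.play()
    }

    // Persist the on-disk location so the asset can be played offline later.
    func urlSession(_ session: URLSession,
                    assetDownloadTask: AVAssetDownloadTask,
                    didFinishDownloadingTo location: URL) {
        UserDefaults.standard.set(location.relativePath, forKey: "assetPath")
    }
}
```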
There are a lot of changes related to HLS and FPS in iOS 11.
There are two examples from Apple for developers:
FairPlay Streaming Server SDK beta - only for Xcode 9
FairPlay Streaming Server SDK - for Xcode 8
As for me: on iOS 10 I can play and download HLS+FPS without any issues, but the same code does not work on iOS 11 (as I wrote, there are a lot of new classes and workflows in it).
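For example, one of the workflows Apple now pushes for FPS is AVContentKeySession, in place of the old AVAssetResourceLoaderDelegate-based key delivery. A rough sketch of that flow, with the application certificate and the key-server exchange left as placeholders:

```swift
import AVFoundation

final class KeyDelegate: NSObject, AVContentKeySessionDelegate {
    func contentKeySession(_ session: AVContentKeySession,
                           didProvide keyRequest: AVContentKeyRequest) {
        // Ask AVFoundation for an SPC, then exchange it with your key server for a CKC.
        let appCertificate = Data() // placeholder: your FPS application certificate
        keyRequest.makeStreamingContentKeyRequestData(forApp: appCertificate,
                                                      contentIdentifier: nil,
                                                      options: nil) { spcData, error in
            guard let spc = spcData else { return }
            _ = spc // send the SPC to your key server and receive the CKC (omitted)
            let ckc = Data() // placeholder: CKC returned by the key server
            let response = AVContentKeyResponse(fairPlayStreamingKeyResponseData: ckc)
            keyRequest.processContentKeyResponse(response)
        }
    }
}

// Usage: attach the session to the asset *before* creating the player item.
let keySession = AVContentKeySession(keySystem: .fairPlayStreaming)
let keyDelegate = KeyDelegate()
keySession.setDelegate(keyDelegate, queue: DispatchQueue(label: "fps.keys"))
// let asset = AVURLAsset(url: streamURL)
// keySession.addContentKeyRecipient(asset)
```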
I have an app on Android and iOS that uses Sinch video calls, but since the launch of iOS 13, the video that iOS 13 devices send is bad; it looks like static. I've discovered that if I use the iPhone in landscape orientation during the video call, the transmitted video is normal. This problem happens even with the Sinch demo. Any ideas how to solve this?
I've tried using multiple connections, updating the framework to the latest version, testing on multiple devices, and modifying the frame from the callAsyncLocalVideoFrameHandler.
I expect to be able to send good video from devices running iOS 13 or higher and iPadOS.
We are aware of the issue; it's caused by a compatibility issue between iOS 13's H264 codec and the WebRTC version we use in our SDK.
We are working to fix that. There will be a new major iOS SDK release to support iOS 13; at the moment we do not recommend that customers release new versions of their apps with iOS 13/Xcode 11 support.
Your current apps will still work on devices running iOS 12 and earlier as well as the new iOS 13, as long as they are not built with Xcode 11.
iOS 13 also brings new directives for using VoIP Push on Apple devices; if you use VoIP Push notifications, this will also change how customers integrate any VoIP SDK.
See note and links on our website.
https://www.sinch.com/docs/resources/downloads/index_vvv.html#info-ios-13-voip-push-changes
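In short: on iOS 13, a VoIP push must be reported to CallKit immediately, otherwise the system terminates the app. A minimal sketch of the PushKit side of that requirement (the provider name and caller handle are placeholders, and any Sinch-specific wiring is omitted):

```swift
import PushKit
import CallKit

final class VoIPPushHandler: NSObject, PKPushRegistryDelegate {
    // The localized name shown by CallKit is a placeholder here.
    let provider = CXProvider(configuration: CXProviderConfiguration(localizedName: "MyApp"))

    func pushRegistry(_ registry: PKPushRegistry,
                      didUpdate pushCredentials: PKPushCredentials,
                      for type: PKPushType) {
        // Send pushCredentials.token to your push backend (omitted).
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didReceiveIncomingPushWith payload: PKPushPayload,
                      for type: PKPushType,
                      completion: @escaping () -> Void) {
        // iOS 13 requirement: report the incoming call to CallKit right away,
        // otherwise the system terminates the app.
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic, value: "caller") // placeholder handle
        provider.reportNewIncomingCall(with: UUID(), update: update) { _ in
            completion()
        }
    }
}
```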
Jorge Siqueira - Sinch Voice & Video.
I also see problems with iOS 13 and Sinch video calls. I have tried going back and building with Xcode 10.3, but that does not help either. iOS 12 still works, as you say.
Audio in HTML5 video works fine in Safari on iOS 10.3.1 on an iPhone, but it doesn't in standalone web apps (same HTML code and video file). Video playback is fine; there is just no sound.
There are a few other related discussions from the past, e.g., Why HTML5 video doesn't play in iOS 8 WebApp (webview)? I tested on iOS 10 using the HTML provided by that post.
I'm not sure whether it is a new bug introduced in iOS 10 or a bug that Apple has never fixed since earlier versions. Does anyone experience this issue? Are there any workarounds? Thanks in advance.
I had the same question. I googled all over the place. Then I realized what was causing the problem: I had the physical "soft mute" switch (next to the volume buttons) on my iPad turned on. Infuriatingly, this muted the volume in web apps, but not on Safari web pages.
I am trying to build an audio/video streaming app that works cross-platform on iOS and Android mobile devices.
No matter how deeply I Google, I end up with suggestions that point me towards the OpenTok/TokBox API. But this is what I wish to avoid.
I've checked a few demos, but WebRTC/HTML5 does not seem to support streaming video/audio in iOS browsers. For example, the https://apprtc.appspot.com demo does not work in Safari or Opera Mini on iOS.
When I try http://dev.opera.com/articles/media-capture-in-mobile-browsers/demo/, I can capture an image using the default iOS camera picker from my browser, but streaming video fails.
It seems like the getUserMedia() API is not supported by any browser on iOS.
Moreover, I am planning to put this in a WebView in a native iOS app, which sounds like a really long shot.
I wish someone could point me towards something that helps me build a video streaming app (hopefully using HTML5) that works uniformly on iOS and Android (without TokBox).
You might want to look into Ericsson's Bowser app: http://www.ericsson.com/research-blog/context-aware-communication/bowser-openwebrtc-released-open-source. It claims to provide WebRTC on Android and iOS. Apparently the app is currently under review in the App Store, so if you wait it may just be a case of downloading it. However, it's also open source, so if you can't wait, you can build it yourself: https://github.com/ericssonresearch/bowser.
getUserMedia and the WebRTC peer-to-peer connection APIs are not supported on iOS.
One of the reasons is that, at the moment, efforts around WebRTC focus on the VP8 video codec, which Apple and Microsoft do not support natively. Support in the near future is unlikely, with Microsoft pushing for its own standard.
Doing what you want on iOS requires a native iOS-compatible solution, such as OpenCV, which supports video capture. You can find tutorials on Google on how to implement a solution based on OpenCV.
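For the capture part specifically, plain AVFoundation is enough on iOS; OpenCV would sit on top for processing. A minimal sketch of native video capture (the queue label is arbitrary, and encoding/networking are omitted):

```swift
import AVFoundation

final class CameraCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()

    func start() {
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        session.startRunning()
    }

    // Each captured frame arrives here; hand it to your encoder/streamer.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // encode and send over the network (omitted)
    }
}
```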
Good news: this will be supported in Safari 11.0.
https://developer.apple.com/library/content/releasenotes/General/WhatsNewInSafari/Safari_11_0/Safari_11_0.html
It is documented all over the web that iOS does not support multiple simultaneous HTML5 audio streams. I ran a test, and here are the observations:
iOS 5 (iPad1) - can play only one audio at a time.
iOS 6 (iPad2) - can play multiple audio.
iOS 7 (iPad3) - can play multiple audio.
It looks like we can play multiple HTML5 audio streams on iOS 6 and 7, but to my surprise I could not find any discussion about this on the web. Also, Apple's documentation still says, "Currently, all devices running iOS are limited to playback of a single audio or video stream at any time."
I am not sure whether multiple HTML5 audio streams will work across all devices that support iOS 6 and iOS 7, and whether Apple will continue to support this in the next version of iOS. Does anyone have any idea about this?
I have tried to live stream audio (AAC-LC) from iOS for three months without much success...
I tried Audio Queues, which work well, but there is a strange delay (~4 s) and I don't know why (a high-level API?).
I tried Audio Units; it sometimes works in the simulator but never on the phone, using modified code from this source.
I am really lost. Can anyone help me?
EDIT
I have to build a live streaming application (iPhone -> Wowza Server via RTSP). The video part works well with little delay (1 s). Now I'm trying to add audio in addition to the video, but I'm stuck with the SDK.
tl;dr: I need to capture microphone input and then send AAC frames over the network without getting a huge delay.
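For what it's worth, the capture-and-encode half can be done with AVAudioEngine plus AVAudioConverter; here is a simplified sketch with the RTSP packetization and network send omitted (the buffer size, packet capacity, and single-buffer converter loop are assumptions, and AVAudioSession setup is left out):

```swift
import AVFoundation

final class MicAACEncoder {
    private let engine = AVAudioEngine()

    func start() throws {
        let input = engine.inputNode
        let micFormat = input.outputFormat(forBus: 0)

        // Target format: AAC-LC at the mic's sample rate and channel count.
        var desc = AudioStreamBasicDescription(mSampleRate: micFormat.sampleRate,
                                               mFormatID: kAudioFormatMPEG4AAC,
                                               mFormatFlags: 0, mBytesPerPacket: 0,
                                               mFramesPerPacket: 1024, mBytesPerFrame: 0,
                                               mChannelsPerFrame: micFormat.channelCount,
                                               mBitsPerChannel: 0, mReserved: 0)
        guard let aacFormat = AVAudioFormat(streamDescription: &desc),
              let converter = AVAudioConverter(from: micFormat, to: aacFormat) else {
            return
        }

        input.installTap(onBus: 0, bufferSize: 1024, format: micFormat) { buffer, _ in
            let packet = AVAudioCompressedBuffer(format: aacFormat,
                                                 packetCapacity: 8,
                                                 maximumPacketSize: converter.maximumOutputPacketSize)
            var fed = false
            var error: NSError?
            // Feed exactly one PCM buffer per tap callback, then signal "no data yet".
            converter.convert(to: packet, error: &error) { _, outStatus in
                if fed { outStatus.pointee = .noDataNow; return nil }
                fed = true
                outStatus.pointee = .haveData
                return buffer
            }
            // `packet` now holds raw AAC packets; wrap and send them over the
            // network here (RTP/RTSP packetization omitted).
        }

        engine.prepare()
        try engine.start()
    }
}
```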
This app, which I just now completed, broadcasts audio between any two iOS devices on the same network:
https://drive.google.com/open?id=1tKgVl0X92SYvgpvbljRzilXNQ6iBcjqM
Compile it with the latest beta release of Xcode 9, and run it on two iOS 11 (beta) devices.
The app is simple; you launch it, and then start talking. Everything is automatic, from network connectivity to audio streaming.
Events generated by the app are displayed in an event log in the app.
Even though the code is simple and concise, the event log was provided to make the app's architecture quicker and easier to understand.