iOS Safari WebRTC streaming quality

I am trying to stream HD video from the smartphone's camera via WebRTC in the browser.
(I am using Wowza as media server on the backend.)
This works and produces good quality on Android, but on iOS the video is really blurry and low quality.
I am testing on a very fast internet connection, so bandwidth should not be the issue.
My assumption is that I need to adjust the SDP, but setting the AS and TIAS bandwidth values to high numbers doesn't seem to have any effect.
Note that iOS uses H.264.
So my question is: is there a setting I'm missing, or is WebRTC on iOS/Safari currently not configurable to use higher bandwidth?
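For reference, this is roughly the kind of SDP munging I have been trying (a minimal sketch; the bandwidth values are illustrative and the helper name is my own):

```typescript
// Insert b=AS (kbit/s) and b=TIAS (bit/s) lines after the video m-line of an SDP.
// Values are illustrative; any existing b= lines would need to be replaced instead.
function raiseVideoBandwidth(sdp: string, kbps: number): string {
  return sdp.replace(
    /(m=video .*\r\n)/,
    `$1b=AS:${kbps}\r\nb=TIAS:${kbps * 1000}\r\n`
  );
}

// Applied to the local description before handing it to the peer connection, e.g.:
// answer.sdp = raiseVideoBandwidth(answer.sdp!, 4000);
```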
Thanks!

Related

How to prevent Web-Audio on iOS from switching to mono when accessing the microphone?

We are developing a web app that is supposed to play high-quality stereo while also accessing the microphone input. We got this to work on all Android and PC browsers, but the iPhone refuses to do it properly. We have narrowed the problem down to the microphone access via "getUserMedia": Web Audio plays stereo until the microphone is enabled, at which point the quality drops and the output switches to mono.

I have researched this problem on the internet, but only found posts that are several years old. My hope is that things have changed in the meantime and solutions have been found. It seems as if the phone switches into some kind of "call mode". I would like to avoid this, either by overriding the corresponding settings or by using something other than Web Audio to play the stereo signal. I am open to any ideas. The worst case seems to be that we have to develop a dedicated native app for iOS. If there is any workaround to make this work in a web app, it would be highly appreciated. If desired, I can provide code snippets, but I think the problem should be clear at this point.
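For reference, the setup looks roughly like this (a minimal sketch; the file URL and the rest of the processing graph are placeholders):

```typescript
// Play a stereo file through Web Audio, then open the microphone.
// On the iPhone, output degrades to mono as soon as getUserMedia() resolves.
const audioCtx = new AudioContext();

async function playStereoThenEnableMic(): Promise<void> {
  // Stereo playback (fine until the microphone is enabled).
  const response = await fetch("/audio/stereo-test.mp3"); // placeholder URL
  const buffer = await audioCtx.decodeAudioData(await response.arrayBuffer());
  const source = audioCtx.createBufferSource();
  source.buffer = buffer;
  source.connect(audioCtx.destination);
  source.start();

  // Opening the microphone is the step that triggers the "call mode" behaviour.
  const micStream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const micInput = audioCtx.createMediaStreamSource(micStream);
  // ...route micInput into the rest of the processing graph here...
}
```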
BTW, on Android we had similar problems and found that the "Dolby Atmos" setting causes a strange downmix to mono under certain circumstances. Switching it off fixed the Android issues. Maybe this helps somebody else, and maybe there are similar global settings on the iPhone that could cure the problem.
Thank you very much in advance!
Cheers,
Chris
"The worst case seems to be that we have to develop a dedicated native app for iOS."
Unfortunately, this is the path forward most likely to yield success for you (if you haven't already figured this problem out, since I'm answering your question almost a year later).
The audio I/O device landscape is complicated, and there are several standards and factors that play into the quality of audio input and output an application yields. For instance:
Is the audio input device the same as the audio output device?
If yes, is the audio I/O device Bluetooth?
If yes, it's unlikely that the Bluetooth audio device supports stereo output and simultaneous audio input; few I/O devices support that, and few host devices do either.
If yes, which Bluetooth version does the I/O device support?
Which operating system is the host device running?
Does the host device's operating system support the selection of audio input and output devices separately?
How much access to the host operating system does your application have?
For example, if your application is running in a browser, your application will have significantly less control over the host device's operating system's audio subsystem.
I have been trying to work around this problem recently, too. Another developer did a detailed investigation of this using a spectrum analyser and scope, and didn't have any luck either: testing iOS audio play/record with scope.
I think building a native app will end up being inevitable, and would also fix the myriad other problems with Safari web audio. Either that or wait until Apple fixes the bugs or implements AudioWorklets.

Video Streaming in iOS through WebRTC

I am trying to build an audio/video streaming app that works cross-platform on iOS and Android mobile devices.
No matter how deep I dig on Google, I end up with suggestions pointing me towards the OpenTok/TokBox API, which is exactly what I wish to avoid.
I've checked a few demos, but WebRTC/HTML5 does not seem to work for streaming video/audio in any iOS browser. For example, the https://apprtc.appspot.com demo does not work in Safari or Opera Mini on iOS.
When I try http://dev.opera.com/articles/media-capture-in-mobile-browsers/demo/ I can capture an image using the default iOS camera picker from my browser, but streaming video fails.
It seems that getUserMedia() is simply not supported by any browser on iOS.
Moreover, I am planning to put this in a WebView inside a native iOS app, which seems even further out of reach.
I wish someone could point me towards something that would help me build a video streaming app (hopefully using HTML5) that works uniformly on iOS and Android (without TokBox).
You might want to look into Ericsson's Bowser app: http://www.ericsson.com/research-blog/context-aware-communication/bowser-openwebrtc-released-open-source. It claims to provide WebRTC on Android and iOS. Apparently the app is currently under review in the App Store, so if you wait it may just be a case of downloading it. However, it is also open source, so if you can't wait you can build it yourself: https://github.com/ericssonresearch/bowser.
getUserMedia and the WebRTC peer-to-peer connection APIs are not supported on iOS.
One reason is that, at the moment, WebRTC efforts focus on the VP8 video codec, which Apple and Microsoft do not support natively. Support in the near future is unlikely, with Microsoft pushing for its own standard.
Doing what you want on iOS requires a native iOS-compatible solution such as OpenCV, which supports video capture. You can find tutorials on Google on how to implement a solution based on OpenCV.
Good news: this will be supported in Safari 11.0.
https://developer.apple.com/library/content/releasenotes/General/WhatsNewInSafari/Safari_11_0/Safari_11_0.html
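Since support depends on the Safari version, a quick capability check is worth doing before attempting a call (a minimal sketch, not tied to any particular framework):

```typescript
// Detect whether this browser exposes the APIs needed for WebRTC capture/streaming.
const hasGetUserMedia =
  typeof navigator !== "undefined" &&
  !!navigator.mediaDevices &&
  typeof navigator.mediaDevices.getUserMedia === "function";

const hasPeerConnection = typeof RTCPeerConnection !== "undefined";

if (!hasGetUserMedia || !hasPeerConnection) {
  // Fall back to a native app or another transport on older iOS Safari versions.
  console.warn("WebRTC APIs are not available in this browser.");
}
```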

Streaming AAC+ (inside FLV) on iOS using AIR

For a streaming radio station, I have an AAC+ audio stream, inside an FLV container, delivered via HTTP. An example URL is http://3023.live.streamtheworld.com/ALTROCK_S01A_AAC. I wrote a simple AIR app (using the latest AIR and Flex SDKs) to play this stream, and it works fine on PC and Android, but it doesn't play anything when deployed to the iOS simulator or a device (i.e., the bytes are loaded but there is no sound).
This is similar to Can FLV AAC stream be played in Android, but for iOS.
I wanted to use AIR in this scenario, since I need to listen for the cue points in the FLV, which is easy to do when playing Flash in a web browser, so AIR seemed like the natural choice. I have also looked at http://code.google.com/p/haxecast/ and https://code.google.com/p/project-thunder-snow/, but they all seem to use the same basic idea (parse the FLV using NetStream in "data generation mode" and feed the AAC+ data to a Video object), and so they all hit the same wall on iOS.
I also came across this post which seems possibly related although it's not quite the same situation (e.g., it's not FLV).
Is AIR on iOS supposed to support this scenario- namely, streaming AAC+/FLV audio via HTTP?
EDIT: This post also appears to hit the same obstacle - so a lot of people are asking about this situation. Anyone from Adobe have any insight?
After much further research, I've concluded that AIR on iOS simply doesn't support this, and you have to build a native app (or at least use a framework other than AIR) instead.

RTSP streaming not working in Nokia series 40 mobile phones

I am new to creating Nokia applications for Series 40 mobiles.
I am trying to use YouTube channel videos in my application, but I am getting an error in the simulator:
"The RTSP streaming feature is currently not supported in the Web Apps Simulator. Please test the streaming feature on one of the supported devices."
I tested on supported devices using the Nokia deploy method, but it is still not working, so can anyone please help me solve the problem?
I believe live video streaming is not possible on Nokia S40 series phones, perhaps due to technical (or design) limitations.
That said, on some devices, opening a web link that streams video will sometimes launch the media player and start streaming. However, because the S40 series are entry-level phones with low-quality antennas for internet access, connection speeds are low, and it can look as though video buffering or streaming is simply not possible.
It may also be that the simulator you are using does not support live video streaming. If your internet connection is working properly (and is fast enough), then this is the likely root cause.

H.264 encoded MP4 presented in HTML5 plays on Safari but not iOS devices

I'm using Adobe Media Encoder CS5 to encode an FLV file to H.264 for presentation on the web via HTML5. The video plays just fine in Safari on OS X (and in Firefox when encoded to OGG), but on any iOS device (iPad, iPhone) I get the play icon with a slash through it.
Has anyone encountered this before and if so, any ideas as to why?
Thanks.
We had this problem and found that encoding the files in accordance with the iPhone webview's standards produced files that played fine.
Not all H.264-encoded MP4 files are supported by the iPhone (or by Chrome, for that matter), and slight differences in the encoding process can produce videos that do not work. Even if the exact same encoding settings are used, H.264 is a variable-bitrate encoder, so different videos may exceed bitrate limits, causing some to work and others not.
The encoding settings that were successful for us were:
Use only the H.264 Baseline Profile, Level 3.0
Resolution below 640 x 480 and frame rate up to 30 fps
No B-frames (they are not supported in the Baseline profile)
A bitrate limit of 900 kbps
Here is the reference we used to arrive at those settings.
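As a quick sanity check from the page itself, you can ask the browser whether it accepts that profile/level (a small sketch; the codec strings correspond to Baseline Profile Level 3.0 video and AAC-LC audio):

```typescript
// "avc1.42E01E" = H.264 Baseline Profile, Level 3.0; "mp4a.40.2" = AAC-LC audio.
const probe = document.createElement("video");
const support = probe.canPlayType('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');

// canPlayType returns "probably", "maybe", or "" (empty string = unsupported).
console.log("Baseline 3.0 MP4 support:", support || "none");
```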
I know this question already has an accepted answer, but we had the exact same issue.
The problem ended up being a setting on our internal network.
After turning on the Safari console on the iPad, we saw that a "byte_range_error_message" was being logged when the video tried to load. It turns out mobile devices request this content differently from desktop devices, by requesting certain byte ranges at a time. We found that the video played fine when the phones used their 3G connections, but not when they loaded it over the internal wifi. A lot of research later led us to this Microsoft support article:
http://support.microsoft.com/kb/922330
That article explained how to find the setting in our firewall that allows the devices to request the video properly. We also found a similar setting on one of our D-Link routers for a separate wifi network that had the same problem.
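If you want to verify this yourself, a ranged request from the browser should come back with a 206 Partial Content status when the path between device and server is behaving (a sketch; the video URL is a placeholder):

```typescript
// iOS devices fetch MP4s with Range requests; a compliant path returns 206 Partial Content.
async function supportsByteRanges(url: string): Promise<boolean> {
  const res = await fetch(url, { headers: { Range: "bytes=0-1" } });
  return res.status === 206;
}

supportsByteRanges("/videos/example.mp4").then((ok) =>
  console.log(ok ? "Byte ranges honoured" : "Byte ranges NOT honoured")
);
```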
