I'm currently running an Expo project with react-native-webview 9.4.0 installed, and I'm trying to use the WebView to play audio from a local .html file via a custom JS script that uses the Web Audio API. The audio works fine on the iOS simulator and on a physical iPhone when Silent Mode is off, but once I turn Silent Mode on I can't get any audio to play at all.
There seem to have been a couple of attempts to fix this issue over the past few months (https://github.com/react-native-community/react-native-webview/pull/1218 and https://github.com/react-native-community/react-native-webview/issues/1140), but even with the most recent version of react-native-webview I still can't get Web Audio to play while the iPhone is muted. I've tried various workarounds from Stack Overflow without any success either. Does anyone else have this issue, or a workaround to suggest?
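For context, the silent-switch behaviour usually comes down to the app's AVAudioSession category: the default solo-ambient category is muted by the hardware switch, while the playback category is not. Below is a minimal sketch of that workaround for a bare or ejected React Native project; the function name is made up, and whether WKWebView honours the host app's session varies by iOS version, so treat it as something to try rather than a guaranteed fix.

```swift
import AVFoundation

// Hypothetical sketch: call once at startup (e.g. from the AppDelegate of a
// bare/ejected React Native app). The .playback category is not silenced by
// the hardware mute switch, unlike the default solo-ambient category.
func configurePlaybackAudioSession() {
    do {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, mode: .default, options: [.mixWithOthers])
        try session.setActive(true)
    } catch {
        print("Failed to configure AVAudioSession: \(error)")
    }
}
```

In a managed Expo project, where native code isn't directly editable, the closest equivalent should be calling expo-av's `Audio.setAudioModeAsync({ playsInSilentModeIOS: true })` before the WebView loads.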
Related
I am creating an app that uses the audioplayers package to play audio files. Initially the audio plays perfectly, but after a few more rounds it starts playing with a delay (5 seconds or even longer).
This seems to be an issue many people are facing, as there are a ton of questions about it but no proper answers. I found one answer, but the issue still seems to persist. If anyone has a workaround for this problem, please share it here.
I am using a real iOS device (not tested on an Android device), Flutter version 2.0.1, stable channel.
I want my HTML5 audio player, which plays an online streaming radio link, to keep playing even when the device is locked (sleep mode) or the app is sent to the background while another app is opened. The player works very well when I start it as a web app from the Safari browser, but now that I have built it for iOS using PhoneGap Build, the player stops when the device is locked or the user switches between apps. Is there any solution for this? Maybe a plugin?
Here is my web app on GitHub: https://github.com/albpower/radio-pendimi
I have solved this problem using the Background Audio iOS PhoneGap plugin. You can find more information about the plugin and its usage in the official PhoneGap plugins directory.
https://build.phonegap.com/plugins/1193
I'm writing a mobile application with Adobe AIR. The application uses the AIR Microphone API to record sound to a file and replay it later.
The problem manifests only on mobile devices, not in the simulator, and specifically only on iOS devices; Android devices seem to work fine.
Sometimes the recorded sound is missing samples. I know this because I use iFunBox to copy the recorded file to another application that replays it. The dropped samples manifest during playback as very fast audio, because only part of the samples were recorded.
Sometimes the playback is too slow, which manifests as very slow audio. I know this because when the recording is fine, the other application plays the sound correctly, and when I take a file I recorded in the simulator (which runs on my MacBook), it plays slowly only on the mobile device.
How can I make sure the sound is good even when the application is a bit busy?
I've built the application as an ad hoc package and installed it on an iPad using TestFlight, and now everything seems to work just fine.
I guess that during debugging Adobe AIR did not manage to fill the sound buffer fast enough, which caused the distortion.
I have tried to live stream audio (AAC-LC) from iOS for 3 months without much success...
I tried Audio Queues, which work well, but there is a strange delay (~4 s) and I don't know why (maybe because it's a high-level API?).
I tried Audio Units with modified code from this source; it sometimes works on the simulator but never on the phone.
I am really lost. Can anyone help me?
EDIT
I have to build a live-streaming application (iPhone -> Wowza Server via RTSP). The video part works well with little delay (~1 s). Now I'm trying to add audio on top of the video, but I'm stuck with the SDK.
tl;dr: I need to capture microphone input and then send AAC frames over the network without a huge delay.
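For reference, here is a minimal sketch of one alternative capture-and-encode path: tapping the microphone with AVAudioEngine and converting each PCM buffer to AAC-LC with AVAudioConverter. This is only an assumed approach, not the Audio Queue/Audio Unit code referenced above; the class and the send(_:) placeholder are made up, the RTSP/RTP packetisation is omitted, and the latency you end up with will depend on buffer sizes.

```swift
import AVFoundation

/// Hypothetical sketch: capture microphone PCM with AVAudioEngine and encode
/// each buffer to AAC-LC with AVAudioConverter. Network delivery is out of
/// scope; send(_:) is just a placeholder. Assumes microphone permission has
/// already been granted.
final class AACMicCapture {
    private let engine = AVAudioEngine()
    private var converter: AVAudioConverter?

    func start() throws {
        // Use a record-capable audio session.
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, mode: .default)
        try session.setActive(true)

        let input = engine.inputNode
        let inFormat = input.outputFormat(forBus: 0)

        // Describe the AAC-LC output format (1024 frames per AAC packet),
        // keeping the input sample rate and channel count to avoid resampling.
        var desc = AudioStreamBasicDescription(mSampleRate: inFormat.sampleRate,
                                               mFormatID: kAudioFormatMPEG4AAC,
                                               mFormatFlags: 0,
                                               mBytesPerPacket: 0,
                                               mFramesPerPacket: 1024,
                                               mBytesPerFrame: 0,
                                               mChannelsPerFrame: inFormat.channelCount,
                                               mBitsPerChannel: 0,
                                               mReserved: 0)
        guard let outFormat = AVAudioFormat(streamDescription: &desc),
              let converter = AVAudioConverter(from: inFormat, to: outFormat) else {
            return
        }
        self.converter = converter

        // Small tap buffers keep the added capture latency low.
        input.installTap(onBus: 0, bufferSize: 1024, format: inFormat) { [weak self] pcm, _ in
            guard let self = self else { return }
            let packet = AVAudioCompressedBuffer(format: outFormat,
                                                 packetCapacity: 8,
                                                 maximumPacketSize: converter.maximumOutputPacketSize)
            var consumed = false
            var error: NSError?
            // Feed the tap buffer to the encoder exactly once per callback.
            let status = converter.convert(to: packet, error: &error) { _, outStatus in
                if consumed {
                    outStatus.pointee = .noDataNow
                    return nil
                }
                consumed = true
                outStatus.pointee = .haveData
                return pcm
            }
            if status != .error, packet.packetCount > 0 {
                self.send(packet)
            }
        }
        try engine.start()
    }

    /// Placeholder: packetise the AAC frames (e.g. into RTP) and push them to
    /// the streaming server here.
    private func send(_ buffer: AVAudioCompressedBuffer) {}
}
```

This is a higher-level route than raw Audio Units, so it trades some control over buffering for much less code; if the ~4 s Audio Queue delay was caused by large default buffers, a small tap buffer like the one above is one of the first knobs to try.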
This app, which I just now completed, broadcasts audio between any two iOS devices on the same network:
https://drive.google.com/open?id=1tKgVl0X92SYvgpvbljRzilXNQ6iBcjqM
Compile it with the latest beta release of Xcode 9, and run it on two iOS 11 (beta) devices.
The app is simple; you launch it, and then start talking. Everything is automatic, from network connectivity to audio streaming.
Events generated by the app are displayed in an event log inside the app.
Even though the code is simple and concise, the event log was provided to make the app's architecture quicker and easier to understand.
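The full project is behind the link above; as a purely illustrative aside, here is a small sketch of what automatic network connectivity between two nearby iOS devices can look like using MultipeerConnectivity. This is an assumption about one possible mechanism, not necessarily what the linked app does, and the audio path is omitted entirely.

```swift
import UIKit
import MultipeerConnectivity

/// Hypothetical sketch: zero-configuration peer discovery with
/// MultipeerConnectivity. Both devices advertise and browse, invite whoever
/// they find, and accept whatever invitation arrives. A real app would also
/// set a session delegate and break ties so only one side sends the invite.
final class AutoConnector: NSObject, MCNearbyServiceAdvertiserDelegate, MCNearbyServiceBrowserDelegate {
    private let peerID = MCPeerID(displayName: UIDevice.current.name)
    private lazy var session = MCSession(peer: peerID,
                                         securityIdentity: nil,
                                         encryptionPreference: .required)
    private lazy var advertiser = MCNearbyServiceAdvertiser(peer: peerID,
                                                            discoveryInfo: nil,
                                                            serviceType: "audio-chat")
    private lazy var browser = MCNearbyServiceBrowser(peer: peerID,
                                                      serviceType: "audio-chat")

    func start() {
        advertiser.delegate = self
        browser.delegate = self
        advertiser.startAdvertisingPeer()
        browser.startBrowsingForPeers()
    }

    // Accept every incoming invitation automatically.
    func advertiser(_ advertiser: MCNearbyServiceAdvertiser,
                    didReceiveInvitationFromPeer peerID: MCPeerID,
                    withContext context: Data?,
                    invitationHandler: @escaping (Bool, MCSession?) -> Void) {
        invitationHandler(true, session)
    }

    // Invite every peer we discover on the local network.
    func browser(_ browser: MCNearbyServiceBrowser,
                 foundPeer peerID: MCPeerID,
                 withDiscoveryInfo info: [String: String]?) {
        browser.invitePeer(peerID, to: session, withContext: nil, timeout: 10)
    }

    func browser(_ browser: MCNearbyServiceBrowser, lostPeer peerID: MCPeerID) {}
}
```

Once the session reports a connected peer, audio could be pushed over an MCSession stream, though the linked app may well do this differently.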
When I try to play video from m.youtube.com I get an error:
an error has occured attempting to play media.
How do I solve this?
Make sure you are running the MDS Simulator (start it before the device simulator); otherwise, most applications that use networking on the simulator (including the web browser) will not work.
EDIT: as noted below, YouTube mobile uses RTSP, which isn't supported in the simulators.
Try playing around with different versions of the OS in your emulator. Be aware, however, that BlackBerry support for Flash video is virtually nonexistent even on their handsets, so you're bound to run into issues.