I am creating an app which uses the audioplayers package to play audio files. Initially the audio plays perfectly, but after a few more rounds it starts playing only after a delay (5 seconds or even longer).
This seems to be an issue many people are facing, as there are a ton of questions about it but no proper answers. This is one answer I found, but the issue still persists, I guess. If anyone has a workaround for this problem, please share it here.
I am using a real iOS device (not tested on an Android device), Flutter version 2.0.1, stable channel.
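One workaround often suggested for iOS playback latency in general is to configure and activate the shared AVAudioSession once at startup, instead of letting it be set up around each play() call. Below is a minimal sketch for a Flutter project's ios/Runner/AppDelegate.swift; it assumes the delay is session-related (which is not confirmed), and audioplayers may override the session itself:

    import UIKit
    import Flutter
    import AVFoundation

    @UIApplicationMain
    @objc class AppDelegate: FlutterAppDelegate {
      override func application(
        _ application: UIApplication,
        didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?
      ) -> Bool {
        // Workaround sketch: configure the shared audio session once at
        // startup so repeated play() calls don't pay setup cost each time.
        do {
          try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
          try AVAudioSession.sharedInstance().setActive(true)
        } catch {
          print("Audio session setup failed: \(error)")
        }
        GeneratedPluginRegistrant.register(with: self)
        return super.application(application, didFinishLaunchingWithOptions: launchOptions)
      }
    }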
I am trying to enable "Designed for iPad" as a target for my functioning Flutter-based iOS app. My understanding is that the app runs the iPad version in a VM when the user is on an M1 (or Apple Silicon) Mac.
The app uses just_audio; all features work fine and I can load a file and play it (I am using a StreamAudioSource derived from an encrypted file downloaded to disk). When I try to seek in the file, I get this error from the iOS side, and the audio does not seek but continues playing from where it was (I see a buffering and then a ready state).
AQMEIO_HAL.cpp:736 kAudioDevicePropertyMute returned err 2003332927
There is very little info on this error apart from this fairly old Apple Developer Forums post:
https://developer.apple.com/forums/thread/672311
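For anyone searching later: a CoreAudio OSStatus is usually a four-character code packed into an integer, and 2003332927 decodes to 'who?', which is kAudioHardwareBadObjectError. A small Swift helper to do the decoding (a generic utility, nothing specific to just_audio):

    import Foundation

    // Decode a CoreAudio OSStatus into its four-character code when the
    // bytes are printable ASCII; otherwise fall back to the raw number.
    func fourCC(_ status: OSStatus) -> String {
        let n = UInt32(bitPattern: status)
        let bytes = [UInt8((n >> 24) & 0xFF), UInt8((n >> 16) & 0xFF),
                     UInt8((n >> 8) & 0xFF), UInt8(n & 0xFF)]
        if bytes.allSatisfy({ $0 >= 0x20 && $0 < 0x7F }),
           let code = String(bytes: bytes, encoding: .ascii) {
            return code
        }
        return String(status)
    }

    print(fourCC(2003332927)) // "who?" -> kAudioHardwareBadObjectError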
If I play a file as a stream (encrypted HLS), then seeking works exactly as expected.
Has anyone had any experience with just_audio targeted this way?
Thanks!
I'm currently running an Expo project with react-native-webview 9.4.0 installed, and I'm trying to use the WebView to play audio from a local .html file with a custom JS script using the Web Audio API. The audio works fine on the iOS simulator and on an iPhone when not on Silent Mode, but once I turn on Silent Mode it's impossible to get any audio to play.
There seem to have been a couple of attempts to fix this issue over the past couple of months (https://github.com/react-native-community/react-native-webview/pull/1218 and https://github.com/react-native-community/react-native-webview/issues/1140), but even with the most recent version of react-native-webview I still can't get Web Audio to play with the iPhone muted. I've tried various workarounds from Stack Overflow without any success either. Does anyone else have this issue or have a workaround to suggest?
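For context, audio played under the default session categories (.ambient/.soloAmbient) is muted by the ring/silent switch, while the .playback category is not; the linked PR and issue revolve around setting the category natively. A minimal Swift sketch of that idea, to be called once from native code (e.g. the app delegate) before the WebView plays audio; note that react-native-webview may reset the session itself, so this is not a confirmed fix:

    import AVFoundation

    // Sketch: put the shared audio session into the .playback category so
    // audio keeps playing even with the ring/silent switch on.
    func enablePlaybackInSilentMode() {
        do {
            let session = AVAudioSession.sharedInstance()
            try session.setCategory(.playback, mode: .default)
            try session.setActive(true)
        } catch {
            print("Failed to configure audio session: \(error)")
        }
    }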
I have an audio app that plays for many hours throughout the night. Sometimes the iPhone changes routes, says the speaker is no longer available, and stops playing audio. I'm not sure why, but my theory is that it tried to connect to a Bluetooth device. I get the following in the log:
AVAudioSessionRouteChangeReasonCategoryChange
AVAudioSessionRouteChangeReasonOldDeviceUnavailable
And since the old device is no longer available, it stops the audio. I'm playing audio through the iPhone speaker, so I'm not sure why the speaker would no longer be available.
My question is: what can I do in code to prevent this from happening? I advise people to turn on Airplane Mode or disable Bluetooth, which always fixes the issue, but I'm wondering if something could be done in code to help solve this problem.
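The usual code-level mitigation is to observe AVAudioSession.routeChangeNotification and, when the reason is .oldDeviceUnavailable (the system pauses playback on that reason), reactivate the session and resume playback yourself. A minimal Swift sketch; resumePlayback is a hypothetical hook into whatever player the app uses:

    import AVFoundation

    final class RouteChangeObserver {
        // Hypothetical hook: called when playback should be restarted.
        var resumePlayback: (() -> Void)?

        init() {
            NotificationCenter.default.addObserver(
                self,
                selector: #selector(handleRouteChange(_:)),
                name: AVAudioSession.routeChangeNotification,
                object: nil)
        }

        @objc private func handleRouteChange(_ note: Notification) {
            guard let raw = note.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt,
                  let reason = AVAudioSession.RouteChangeReason(rawValue: raw) else { return }

            // The system stops audio when the route it was using goes away,
            // e.g. a Bluetooth device that briefly grabbed the route.
            if reason == .oldDeviceUnavailable || reason == .categoryChange {
                do {
                    try AVAudioSession.sharedInstance().setActive(true)
                } catch {
                    print("Could not reactivate audio session: \(error)")
                }
                resumePlayback?()
            }
        }
    }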
I'm writing a mobile application with Adobe AIR. The application uses the AIR Microphone API to record sound to a file and later replay it.
The problem manifests only on mobile devices, not in the simulator; specifically, only on iOS devices, Android devices seem to work OK.
Sometimes the recorded sound is missing samples. I know this because I use iFunBox to copy the recorded file to another application that replays it. The dropped frames manifest during playback as very fast audio, because only part of the samples were recorded.
Sometimes the playback is too slow, which manifests as very slow audio. I know this because when the recording is fine the other application plays the sound correctly, and when I take a file I recorded in the simulator (which runs on my MacBook), it plays slowly only on the mobile device.
How can I make sure the sound is good even when the application is a bit busy?
I've built the application as an ad-hoc package and installed it on an iPad using TestFlight, and now everything seems to work just fine.
I guess that during debugging, Adobe AIR did not manage to fill the sound buffer fast enough, which caused the distortion.
I have been trying to live stream audio (AAC-LC) from iOS for 3 months without much success...
I tried Audio Queues, which work well, but there is a strange delay (~4 s) and I don't know why (a high-level API?).
I tried Audio Units; they sometimes work in the simulator but never on the phone, using modified code from this source.
I am really lost, can anyone help me?
EDIT
I have to build a live streaming application (iPhone → Wowza server via RTSP). The video part works well with little delay (1 s). Now I'm trying to add audio in addition to the video, but I'm stuck with the SDK.
tl;dr: I need to capture microphone input and then send AAC frames over the network without incurring a huge delay.
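A note for later readers rather than a period-accurate answer: Audio Queues buffer several chunks internally, which is one common explanation for multi-second latency at that level of the API. On newer iOS versions, a lower-latency capture path is AVAudioEngine with a tap on the input node; the sketch below delivers small PCM buffers that could then be fed to an AAC encoder (e.g. AVAudioConverter) and sent over the network. The encoder and network sender are left out:

    import AVFoundation

    // Low-latency microphone capture with AVAudioEngine, as an alternative
    // to raw Audio Units. Each tap callback hands over a small PCM buffer.
    final class MicCapture {
        private let engine = AVAudioEngine()

        func start(onBuffer: @escaping (AVAudioPCMBuffer) -> Void) throws {
            let session = AVAudioSession.sharedInstance()
            try session.setCategory(.playAndRecord, mode: .measurement)
            try session.setPreferredIOBufferDuration(0.005) // ask for ~5 ms IO buffers
            try session.setActive(true)

            let input = engine.inputNode
            let format = input.outputFormat(forBus: 0)
            input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
                onBuffer(buffer) // feed this PCM to an AAC encoder here
            }
            try engine.start()
        }

        func stop() {
            engine.inputNode.removeTap(onBus: 0)
            engine.stop()
        }
    }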
This app, which I have just completed, broadcasts audio between any two iOS devices on the same network:
https://drive.google.com/open?id=1tKgVl0X92SYvgpvbljRzilXNQ6iBcjqM
Compile it with the latest beta release of Xcode 9, and run it on two iOS 11 (beta) devices.
The app is simple; you launch it, and then start talking. Everything is automatic, from network connectivity to audio streaming.
Events generated by the app are displayed in an event log in the app.
Even though the code is simple and concise, the event log is provided to make the app's architecture quicker and easier to understand.