I am trying to enable "Designed for iPad" as a target for my functioning Flutter-based iOS app. My understanding is that the app runs the iPad version in a VM when the user is on an M1 (or other Apple Silicon) Mac.
The app uses just_audio, and all features work fine: I can load a file and play it (I am using a StreamAudioSource that reads from an encrypted file downloaded to disk). When I try to seek within the file, I get the error below from the iOS side, and the audio does not seek but continues playing from where it was (I do see a buffering and then a ready state).
AQMEIO_HAL.cpp:736 kAudioDevicePropertyMute returned err 2003332927
There is very little information on this error apart from this fairly old Apple Developer Forums post:
https://developer.apple.com/forums/thread/672311
If I play a file as a stream (encrypted HLS), seeking works exactly as expected.
Has anyone had any experience with just_audio targeted this way?
Thanks!
Related
I'm writing a mobile application with Adobe AIR. The application uses the AIR Microphone API to record sound to a file and replay it later.
The problem manifests only on mobile devices, not in the simulator, and specifically only on iOS devices; Android devices seem to work fine.
Sometimes the recorded sound is missing samples. I know this because I use iFunBox to copy the recorded file to another application that replays it. The dropped samples manifest during playback as very fast audio, because only part of the samples were recorded.
Sometimes the playback is too slow, which manifests as very slow audio. I know this because when the recording is fine, the other application plays the sound correctly, and because a file I recorded in the simulator (which runs on my MacBook) plays slowly only on the mobile device.
How can I make sure the sound is good even when the application is a bit busy?
I've built the application as an ad hoc package and installed it on the iPad using TestFlight, and now everything seems to work just fine.
I guess that during debugging Adobe AIR did not manage to fill the sound buffer fast enough, which caused the distortion.
I have tried to live stream audio (AAC-LC) from iOS for three months without much success...
I tried Audio Queues, which work well, but there is a strange delay (~4 s) and I don't know why (is it because it's a high-level API?).
I tried Audio Units; using modified code from this source, it sometimes works on the simulator but never on the phone.
I am really lost; can anyone help me?
EDIT
I have to build a live-streaming application (iPhone -> Wowza server via RTSP). The video part works well with little delay (~1 s). Now I'm trying to add audio alongside the video, but I'm stuck with the SDK.
tl;dr: I need to capture microphone input and then send AAC frames over the network without incurring a huge delay.
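For what it's worth, here is a minimal sketch of one way to do the capture-and-encode half on newer iOS versions, using AVAudioEngine plus AVAudioConverter. Both APIs postdate this question, and sendFrame() is a hypothetical stand-in for whatever hands AAC packets to your RTSP packetizer:

```swift
import AVFoundation

// Hypothetical: hand the encoded AAC packets to your Wowza/RTSP muxer here.
func sendFrame(_ buffer: AVAudioCompressedBuffer) { /* ... */ }

let engine = AVAudioEngine()
let input = engine.inputNode
let inputFormat = input.outputFormat(forBus: 0)

// Describe the AAC-LC output format (sizes are 0 because packets are variable-size).
var aacDesc = AudioStreamBasicDescription(
    mSampleRate: inputFormat.sampleRate,
    mFormatID: kAudioFormatMPEG4AAC,
    mFormatFlags: 0,
    mBytesPerPacket: 0,
    mFramesPerPacket: 1024,   // AAC always encodes 1024-frame packets
    mBytesPerFrame: 0,
    mChannelsPerFrame: UInt32(inputFormat.channelCount),
    mBitsPerChannel: 0,
    mReserved: 0)
let aacFormat = AVAudioFormat(streamDescription: &aacDesc)!
let converter = AVAudioConverter(from: inputFormat, to: aacFormat)!

// Tap the microphone; each callback converts one PCM buffer to AAC packets.
input.installTap(onBus: 0, bufferSize: 1024, format: inputFormat) { pcmBuffer, _ in
    let outBuffer = AVAudioCompressedBuffer(
        format: aacFormat,
        packetCapacity: 8,
        maximumPacketSize: converter.maximumOutputPacketSize)
    var consumed = false
    var error: NSError?
    converter.convert(to: outBuffer, error: &error) { _, outStatus in
        // Feed the tap's buffer exactly once, then report "no data for now".
        if consumed { outStatus.pointee = .noDataNow; return nil }
        consumed = true
        outStatus.pointee = .haveData
        return pcmBuffer
    }
    if error == nil, outBuffer.packetCount > 0 {
        sendFrame(outBuffer)
    }
}

do { try engine.start() } catch { print("engine failed to start: \(error)") }
```

As for the ~4 s Audio Queue delay: with queues, latency tends to scale with the number and duration of the buffers you enqueue, so trimming buffer size and count (or using an Audio Unit with a short preferred I/O buffer duration) is the usual way to shrink it.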
This app, which I just completed, broadcasts audio between any two iOS devices on the same network:
https://drive.google.com/open?id=1tKgVl0X92SYvgpvbljRzilXNQ6iBcjqM
Compile it with the latest beta release of Xcode 9, and run it on two iOS 11 (beta) devices.
The app is simple; you launch it, and then start talking. Everything is automatic, from network connectivity to audio streaming.
Events generated by the app are displayed in an event log in the app:
Even though the code is simple and concise, the event log is provided to make the app's architecture quicker and easier to understand.
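The linked archive is the reference, but as a hint at the kind of thing that makes connectivity "automatic", here is a minimal sketch using MultipeerConnectivity. That framework choice is my assumption; the app above may well do its discovery differently:

```swift
import UIKit
import MultipeerConnectivity

// Minimal zero-configuration peer discovery: every device both advertises
// and browses, invites everyone it finds, and accepts every invitation.
class AutoConnector: NSObject, MCNearbyServiceAdvertiserDelegate, MCNearbyServiceBrowserDelegate {
    let peerID = MCPeerID(displayName: UIDevice.current.name)
    lazy var session = MCSession(peer: peerID)
    lazy var advertiser = MCNearbyServiceAdvertiser(
        peer: peerID, discoveryInfo: nil, serviceType: "audio-chat")
    lazy var browser = MCNearbyServiceBrowser(peer: peerID, serviceType: "audio-chat")

    func start() {
        advertiser.delegate = self
        browser.delegate = self
        advertiser.startAdvertisingPeer()
        browser.startBrowsingForPeers()
    }

    // Advertiser side: accept every invitation automatically.
    func advertiser(_ advertiser: MCNearbyServiceAdvertiser,
                    didReceiveInvitationFromPeer peerID: MCPeerID,
                    withContext context: Data?,
                    invitationHandler: @escaping (Bool, MCSession?) -> Void) {
        invitationHandler(true, session)
    }

    // Browser side: invite every peer we discover.
    func browser(_ browser: MCNearbyServiceBrowser, foundPeer peerID: MCPeerID,
                 withDiscoveryInfo info: [String: String]?) {
        browser.invitePeer(peerID, to: session, withContext: nil, timeout: 10)
    }

    func browser(_ browser: MCNearbyServiceBrowser, lostPeer peerID: MCPeerID) {}
}
```

Once the session is up, you would set the session's delegate and push captured audio over an output stream (session.startStream(withName:toPeer:)) to get the talking half.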
I want to do this with my app and device:
Connect VGA, DVI, or HDMI monitor to appropriate 30-pin adapter cable
Connect adapter to iPhone4/iPad2
Mirror iDevice screen content on monitor screen.
This all used to "just work" on iOS 5. There's even proof of it working in a YouTube video of our demo from last year (see http://youtu.be/xjKk1EJ1yAI; to skip to the image, jump to 1:40 in the video and hit play). All I did was plug in the iPad, and voilà, there it was...
This now seems to "just not work." Do I have to write a whole bunch of code to make this work with every app I build, or is there a setting in the Info.plist file that I should set? Should I build for iOS 5.1 to make it work again, even though the device runs iOS 6+?
I will write the code if I have to, but I'd be a lot happier if it just worked again. The "similar questions" feature on SO was helpful in letting me know that I'm perhaps out of luck and in pointing to some solutions, like Rob Terrell's TVOutManager, but I wasn't able to find an answer to this specific question in that list.
EDIT: I only tried this with an iPhone 4 running iOS 6.0.1 and the VGA adapter. I haven't tried again with an iPad (also on iOS 6.0.1), and the video above was shot using the HDMI adapter. I will try those combinations and update accordingly (but probably not until Monday).
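For reference, the manual fallback that TVOutManager-style code builds on is to watch for an external UIScreen and attach your own UIWindow to it (true mirroring then means redrawing your main screen's content into that window yourself). A minimal sketch of the screen-attachment part, written in modern Swift rather than the Objective-C of that era:

```swift
import UIKit

// Attach a second UIWindow to any external display that connects.
// Mirroring proper would additionally require snapshotting the main
// screen into this window, which is what TVOutManager automates.
class ExternalDisplayManager: NSObject {
    var externalWindow: UIWindow?

    func start() {
        NotificationCenter.default.addObserver(
            self, selector: #selector(screenDidConnect(_:)),
            name: UIScreen.didConnectNotification, object: nil)
        // Handle a display that was already plugged in at launch.
        if UIScreen.screens.count > 1 { attach(to: UIScreen.screens[1]) }
    }

    @objc func screenDidConnect(_ note: Notification) {
        guard let screen = note.object as? UIScreen else { return }
        attach(to: screen)
    }

    private func attach(to screen: UIScreen) {
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen
        window.rootViewController = UIViewController() // your external content
        window.isHidden = false
        externalWindow = window
    }
}
```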
I'm fiddling with an app, and I'm aware that there are apps made by developers that allow an iOS device to receive an audio stream from another iOS device or from iTunes. I'd like to implement this, ideally finding a method within Apple's guidelines that allows audio to be streamed. I've tried looking everywhere, but I can't find where to start. Any ideas, a place to start, or even a pointer in the right direction would be great.
Check out the AirSpeaker project on GitHub:
https://github.com/chenkaigithub/AirSpeaker
I was able to run it on the iOS 6.0 simulator and then use my iPhone to stream audio to it.
However, if I try to stream from iTunes 11, it does not work (iTunes lists the device in the AirPlay list but, on selection, fails with the error "airplay device is not compatible with this version of iTunes.").
When I try to play video from m.youtube.com I get an error:
an error has occured attempting to play media.
How do I solve this?
Make sure you are running the MDS simulator (start it before the device simulator); otherwise, most applications that use networking on the simulator (including the web browser) will not work.
EDIT: As noted below, YouTube mobile uses RTSP, which isn't supported in the simulators.
Try playing around with different OS versions in your emulator. Be aware, however, that BlackBerry support for Flash video is virtually nonexistent even on their handsets, so you're bound to run into issues.