Playing a short sound with SystemSound or AVAudioPlayer on iOS

I'm creating a custom spinner for iOS, and on each rotation change I need to play a sound. The sound itself is 0.5 seconds long. Following Apple's documentation I used System Sound Services to play it, but now I can't control the volume with the hardware buttons, because those adjust the media (playback) volume while, as I understand it, system sounds play in a different context, the ringer.
I tried AVAudioPlayer instead, but there is a small delay while the sound is attached to the player, and this glitches the UI (because of the delay the spinner doesn't rotate smoothly).
Any ideas how this can be solved?
I was thinking about somehow programmatically changing this context by triggering controls that work with system sounds, but no luck.
EDITED:
I've made sure there is only one instance of AVAudioPlayer, and instead of stopping the sound I pause it, only stopping and disposing of the player after the spinner finishes rotating, but I still get the same result.
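
For reference, a minimal sketch of the setup the question describes: a single AVAudioPlayer kept alive, primed with prepareToPlay() before the spinner starts so the first play() doesn't stall the UI, and disposed of only when the spinner finishes. The class name and sound file name are illustrative assumptions, not taken from the question.

import AVFoundation

final class SpinnerSound {
    private var player: AVAudioPlayer?

    // Load and prime the player once, e.g. before the spinner starts,
    // so the playback latency is paid here rather than on the first tick.
    func preload() {
        guard let url = Bundle.main.url(forResource: "tick", withExtension: "caf") else { return }
        player = try? AVAudioPlayer(contentsOf: url)
        player?.prepareToPlay()
    }

    // Called on every rotation change.
    func tick() {
        player?.currentTime = 0
        player?.play()
    }

    // Called once the spinner finishes; only now is the player released.
    func dispose() {
        player?.stop()
        player = nil
    }
}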

Related

Best way to play silence using AVAudioPlayer on iOS

I found myself in a situation where I need to simulate audio playback to trick the OS controls and MPNowPlayingInfoCenter into thinking that audio is being played. This is because I am building a player that plays multiple audio tracks, with pauses in between, creating one continuous "audio" track. I already have everything set up inside the app itself, and the lock screen controls work correctly; the only problem I am facing is that while the actual audio stops and a pause is being "played", the lock screen info center stops its timer, and it only resumes showing the correct time and overall state once another audio track starts playing.
Here is an example of my audio track built from audio files and pause items:
let items: [AudioItem] = [
    .audio("part-1.mp3"),
    .pause(duration: 5), // value of type TimeInterval
    .audio("part-2.mp3"),
    .pause(duration: 3),
    // ... the list goes on
]
Then, in my custom player, once AVAudioPlayer finishes its job with the current item, I get the next one from the array and play it: either a .pause with a scheduled Timer, or another .audio with AVAudioPlayer.
extension Player: AVAudioPlayerDelegate {
    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        playNextItem()
    }
}
And here lies the problem: once the AVAudioPlayer stops, the Now Playing info center automatically stops too, even though I keep feeding it fresh nowPlayingInfo. Then, when it hits another .audio item, it resumes correctly and shows the current time, etc.
And here lies the question: how do I trick MPNowPlayingInfoCenter into thinking that audio is being played while I "play" my .pause item?
I realise that it may still not be clear what I am trying to achieve, but I am happy to share more insight if needed. Thanks!
Some solutions I am currently thinking about:
A. Keeping a 1 s long empty audio track that would play on a loop for as long as the pause needs to last.
B. Programmatically creating an empty audio track of the appropriate length and playing it, instead of using a Timer to keep track of the pause duration/progress, relying completely on AVAudioPlayer for both .audio and .pause items (see the sketch after this list). Not sure this is possible, though.
C. Maybe there is a way to tell MPNowPlayingInfoCenter that the audio keeps playing without using AVAudioPlayer, through some API I am not familiar with?
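
As an aside, option B is feasible: a zero-filled PCM buffer can be written to a temporary file and handed to a regular AVAudioPlayer. The sketch below only illustrates that idea; the helper name and settings are made up and are not from the question or the answer.

import AVFoundation

// Hypothetical helper: writes `duration` seconds of silence to a temporary
// CAF file so a plain AVAudioPlayer can "play" the pause like any .audio item.
func makeSilentFile(duration: TimeInterval) throws -> URL {
    let sampleRate = 44_100.0
    let frameCount = AVAudioFrameCount(duration * sampleRate)
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent("silence-\(Int(duration)).caf")

    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVSampleRateKey: sampleRate,
        AVNumberOfChannelsKey: 1
    ]
    let file = try AVAudioFile(forWriting: url, settings: settings)

    // Allocate a buffer in the file's processing format and zero every sample.
    let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                  frameCapacity: frameCount)!
    buffer.frameLength = frameCount
    if let channels = buffer.floatChannelData {
        for frame in 0..<Int(frameCount) {
            channels[0][frame] = 0
        }
    }
    try file.write(from: buffer)
    return url
}

// Usage: let silence = try AVAudioPlayer(contentsOf: makeSilentFile(duration: 5))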
AVAudioPlayer is probably the wrong tool here. You want AVAudioPlayerNode, which is slightly lower-level. Create an AVAudioEngine, and attach an AVAudioPlayerNode. You can then call scheduleFile(_:at:completionHandler:) to play the audio at the times you want.
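
A minimal sketch of that setup, assuming a local file URL; audio session configuration and error handling are left out, and the function names are only illustrative.

import AVFoundation

let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()

// Attach and wire up the player node once, then keep the engine running.
func startEngine() throws {
    engine.attach(playerNode)
    engine.connect(playerNode, to: engine.mainMixerNode, format: nil)
    try engine.start()
}

// Schedule a file; `at: nil` means "play as soon as possible", and the
// completion handler is the natural place to schedule the next item.
func play(url: URL) throws {
    let file = try AVAudioFile(forReading: url)
    playerNode.scheduleFile(file, at: nil) {
        // Schedule the next .audio item here, or wait out a .pause.
    }
    playerNode.play()
}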
Much of the Apple documentation on AVAudioEngine appears broken right this moment, but hopefully the links under Audio Engine Building Blocks will be available again shortly. (If it stays down and you have trouble finding docs, leave a comment and I'll hunt down the WWDC videos and other tutorials on using AVAudioEngine. It's not particularly difficult for simple problems.)
If you know in advance how you want to compose these items (and it looks like you may), see also AVMutableComposition, which lets you glue together assets very efficiently, including adding empty segments of silence. See Media Composition and Editing for the various tools in that space.
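
A rough sketch of that approach, assuming a local audio file: the pause becomes an empty time range appended to a single composition, which can then be played as one AVPlayerItem. The function name and durations are placeholders.

import AVFoundation

// Glue one audio file and a trailing silent gap into a single composition.
func makeComposition(audioURL: URL, pause: TimeInterval) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    let track = composition.addMutableTrack(
        withMediaType: .audio,
        preferredTrackID: kCMPersistentTrackID_Invalid)!

    let asset = AVURLAsset(url: audioURL)
    if let sourceTrack = asset.tracks(withMediaType: .audio).first {
        let range = CMTimeRange(start: .zero, duration: asset.duration)
        try track.insertTimeRange(range, of: sourceTrack, at: .zero)
    }

    // Append an empty (silent) segment after the audio.
    let gap = CMTimeRange(start: composition.duration,
                          duration: CMTime(seconds: pause, preferredTimescale: 600))
    track.insertEmptyTimeRange(gap)
    return composition
}

// Usage: AVPlayer(playerItem: AVPlayerItem(asset: try makeComposition(audioURL: url, pause: 5)))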

AVAudioPlayer no sound when device is muted

What is the easiest way to program the AVAudioPlayer so that it does not play music while the device is muted (the physical switch on the left-hand side is flipped towards the back of the phone)? It should only play music if the device is unmuted.
You can have a slider that controls the volume for your AVAudioPlayer, with an outlet/action that reads the slider's value. When the slider value is zero, i.e. the volume is zero, just call pause() on your player. When the volume is not zero there are two cases, the audio is playing or the audio is paused, so you can keep a boolean flag for the playing state and branch on it with a nested if/else. I hope this answer helps you.
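
A minimal sketch of what this answer describes, wired to a UISlider action; the property names and the wasPlayingBeforeMute flag are illustrative assumptions, not part of the original answer.

import UIKit
import AVFoundation

final class PlayerViewController: UIViewController {
    var audioPlayer: AVAudioPlayer?        // assumed to be set up elsewhere
    private var wasPlayingBeforeMute = false

    @IBAction func volumeSliderChanged(_ sender: UISlider) {
        guard let player = audioPlayer else { return }
        if sender.value == 0 {
            // Volume dragged to zero: remember the state and pause.
            wasPlayingBeforeMute = player.isPlaying
            player.pause()
        } else {
            player.volume = sender.value
            // Volume raised again: resume only if it was playing before.
            if wasPlayingBeforeMute && !player.isPlaying {
                player.play()
            }
        }
    }
}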

Transitioning to background, AVPlayer Video has small audio gap

I have followed the many helpful previous questions to get my AVPlayer successfully streaming video when my app goes to the background. There are two methods described in Apple's QA1668 and they both work for my stream URLs.
The problem is that there is a noticeable audio gap during the transition that is identical for both methods. On my iPhone 6 in release mode I would say the gap is less than 0.5 seconds, which may not seem terrible, but if I'm playing something like a music video it is very distracting.
After more testing it looks like this gap actually occurs when I remove the AVPlayerLayer (or, with the other method, when I disable the AVMediaCharacteristicVisual tracks), as I have determined it still happens if I hook those actions up to a button rather than the backgrounding state.
My guess is that it has something to do with the audio re-syncing to the new video state of the AVPlayer, but really I have no clue. Any help would be greatly appreciated!
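
For context, a minimal sketch of the first QA1668 method the question refers to: detach the AVPlayer from its AVPlayerLayer when the app backgrounds and re-attach it on foregrounding. The stream URL and property names are placeholders.

import UIKit
import AVFoundation

final class VideoViewController: UIViewController {
    let player = AVPlayer(url: URL(string: "https://example.com/stream.m3u8")!)
    private var playerLayer: AVPlayerLayer?
    private var observers: [NSObjectProtocol] = []

    override func viewDidLoad() {
        super.viewDidLoad()
        let layer = AVPlayerLayer(player: player)
        layer.frame = view.bounds
        view.layer.addSublayer(layer)
        playerLayer = layer

        let center = NotificationCenter.default
        // Detach the player from its layer so audio keeps running in the background.
        observers.append(center.addObserver(forName: UIApplication.didEnterBackgroundNotification,
                                            object: nil, queue: .main) { [weak self] _ in
            self?.playerLayer?.player = nil
        })
        // Re-attach when coming back to the foreground.
        observers.append(center.addObserver(forName: UIApplication.willEnterForegroundNotification,
                                            object: nil, queue: .main) { [weak self] _ in
            self?.playerLayer?.player = self?.player
        })
    }
}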

MPMoviePlayerController play/pause toggle issue

I have used MPMoviePlayerController in my application to play selected videos. There is also a facility to start a video from a different position, e.g. from 10 min, 20 min, 40 min, etc.
Problem in iOS 7:
The problem is that when a video is played I am able to pause it, but it does not resume after pausing. Also, the pause button does not turn into a play button. After clicking the pause button, none of the notifications like "MPMoviePlayerPlaybackStateDidChangeNotification" get invoked. The same problem occurs when the video is playing in fullscreen mode.
Problem in iOS 6:
Here the only problem is that the pause button does not turn into a play button; pausing and resuming the video works properly, in fullscreen mode as well.
One strange behaviour:
For one video, when I play it from 40 min it works perfectly; none of the above issues occur for it. But the same video does not work when played from the start or from the 10 or 20 min positions.
I have searched a lot but found only one post here related to this issue, and that solution is not working for me.
Does anybody know how to solve this?
Use AVPlayer, as MPMoviePlayerController is now deprecated.
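
A minimal sketch of that migration, assuming AVPlayerViewController from AVKit and seeking to one of the offsets mentioned in the question; the URL is a placeholder.

import UIKit
import AVKit
import AVFoundation

func presentVideo(from presenter: UIViewController, url: URL, startMinutes: Double) {
    let player = AVPlayer(url: url)
    let controller = AVPlayerViewController()
    controller.player = player

    presenter.present(controller, animated: true) {
        // Jump to e.g. the 10/20/40 minute mark, then start playback.
        let start = CMTime(seconds: startMinutes * 60, preferredTimescale: 600)
        player.seek(to: start) { _ in
            player.play()
        }
    }
}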

How should I go about synchronizing sound effects with UI animations in iOS?

I want to play some sound effects while animating a UI element (e.g. playing a movement sound while a UI object is moving), which requires precise timing and synchronization.
I really can't figure out which framework I should be using from the descriptions in the Multimedia Programming Guide. So I need your kind help in choosing one.
What I want to do is:
Play short (max 10 seconds) sound effects (e.g. a button tap sound).
Be able to synchronize some of them with UI animations (e.g. a view appearing/disappearing).
I tried using the AudioServicesPlaySystemSound function from the AudioToolbox framework; sometimes it works great, but sometimes the sound won't play instantly. For example, when a button is tapped, its action is performed before the sound is played, even though AudioServicesPlaySystemSound is called first in the button's action method.
Thanks in advance,
Mota
Mixing audio waveforms into an already-running RemoteIO Audio Unit configured with short buffers will give the lowest possible audio latency. The cost of this is a more complex API and the need for uncompressed audio assets.
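
Setting up a raw RemoteIO unit involves a fair amount of Core Audio boilerplate. As a rough illustration of the same ideas (keep the output hardware running, request short I/O buffers, and pre-decode the effect into an uncompressed PCM buffer), here is a sketch using AVAudioEngine, which sits on top of the output audio unit. This is an adaptation rather than the answer's exact approach, and the file name is a placeholder.

import AVFoundation

final class EffectPlayer {
    private let engine = AVAudioEngine()
    private let node = AVAudioPlayerNode()
    private var buffer: AVAudioPCMBuffer?

    func start() throws {
        // Ask for a short I/O buffer (~5 ms) to keep output latency low.
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, mode: .default, options: [.mixWithOthers])
        try session.setPreferredIOBufferDuration(0.005)
        try session.setActive(true)

        // Decode the effect once into an uncompressed PCM buffer.
        let url = Bundle.main.url(forResource: "tap", withExtension: "caf")!
        let file = try AVAudioFile(forReading: url)
        let pcm = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                   frameCapacity: AVAudioFrameCount(file.length))!
        try file.read(into: pcm)
        buffer = pcm

        // Keep the engine (and the underlying output unit) running so that
        // triggering a sound later is just a buffer schedule, not a start-up.
        engine.attach(node)
        engine.connect(node, to: engine.mainMixerNode, format: file.processingFormat)
        try engine.start()
        node.play()
    }

    // Call this in the same run-loop turn as the animation start.
    func play() {
        guard let buffer = buffer else { return }
        node.scheduleBuffer(buffer, at: nil, options: .interrupts, completionHandler: nil)
    }
}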
