In my app I am streaming audio, and there is a period of 5-10 seconds, depending on the connection, where the buffer is loading; after that, my app starts to play the audio. When playback starts, this symbol comes up on the screen.
Here is an image of what I'm talking about.
http://img27.imageshack.us/img27/3667/img0596.png
I want to change a label in my app when this symbol comes up on the screen, but I don't know which function lets me detect this.
The symbol is the "Play" button common to music players. There is most likely an NSNotificationCenter message that can be listened for; alternatively, depending on how you are buffering your audio, there is probably a delegate method that fires once playback begins. Without more details I cannot give more specific advice. If I were in your position I would take a hard look at the API you are using: most likely several methods exist that either post notifications or send delegate messages reporting the state of the stream and of playback. I have worked with a few streaming audio APIs and was able to get the buffer status, along with many other messages, from the stream objects. These are part of good design, so most likely they are there.
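The question doesn't say which playback API is in use. If it happens to be AVPlayer (an assumption on my part), one way to detect the moment buffering ends and audio actually starts is to observe `timeControlStatus`, which only flips to `.playing` when audio is really being rendered, not merely when `play()` was called. A minimal sketch:

```swift
import AVFoundation

final class StreamObserver {
    private let player: AVPlayer
    private var observation: NSKeyValueObservation?

    init(url: URL, onPlaybackStarted: @escaping () -> Void) {
        player = AVPlayer(url: url)
        // timeControlStatus becomes .playing only once audio is actually
        // audible, after the initial buffering period ends.
        observation = player.observe(\.timeControlStatus, options: [.new]) { player, _ in
            if player.timeControlStatus == .playing {
                onPlaybackStarted()   // update your label here
            }
        }
        player.play()
    }
}
```

If you are on an older API without `timeControlStatus`, the same idea applies with whatever "now playing" notification or delegate callback your streaming library exposes.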
Related
I am trying to build an iOS app which controls a music player which runs on a separate machine. I would like to use the MPNowPlayingInfoCenter for inspecting and controlling this player. As far as I can tell so far, the app actually has to output audio for this to work (see also this answer).
However, for instance, the Spotify app is actually capable of doing this without playing audio on the iOS device. If you use Spotify Connect to play the audio on a different device, the MPNowPlayingInfoCenter still displays the correct song and the controls are functional.
What's the catch here? What does one (conceptually) have to do to achieve this? I can think of continuously emitting a "silent" audio stream, but that seems a bit brute-force.
Streaming silence will work, but you don't need to stream it all the time, just long enough to send your Now Playing info. Using AVAudioPlayer, I've found that something as short as this will send the data (assuming the player is loaded with a 1 s silent audio file):
player.play()
let nowPlayingInfoCenter = MPNowPlayingInfoCenter.default()
nowPlayingInfoCenter.nowPlayingInfo = [...]
player.stop()
I was very surprised this worked within a single event loop. Any other approach to playing silence seems to work as well. Again, it can be very brief; in my tests the code above doesn't even get as far as actually playing the silence.
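For reference, the `[...]` in the snippet above would hold standard Now Playing keys. A sketch with hypothetical metadata values (the keys themselves are the real MediaPlayer constants):

```swift
import MediaPlayer

// Hypothetical metadata; substitute your remote player's actual state.
let info: [String: Any] = [
    MPMediaItemPropertyTitle: "Track Title",
    MPMediaItemPropertyArtist: "Artist Name",
    MPMediaItemPropertyPlaybackDuration: 240.0,
    MPNowPlayingInfoPropertyElapsedPlaybackTime: 0.0,
    MPNowPlayingInfoPropertyPlaybackRate: 1.0,
]
MPNowPlayingInfoCenter.default().nowPlayingInfo = info
```

Setting `MPNowPlayingInfoPropertyPlaybackRate` and the elapsed time lets the lock screen animate the progress bar on its own between updates.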
I'm very interested in whether this works reliably for other people, so please comment if you make discoveries about it.
I've explored the Spotify app a bit. I'm not 100% certain if this is the same technique they're using. They're able to mix with other audio somehow. So you can be playing local audio on the phone and also playing Spotify Connect to some other device, and the "Now Playing" info will kind of stomp on each other. For my purposes, that would actually be better, but I haven't been able to duplicate it. I still have to make the audio session non-mixable for at least a moment (so you get ~ 1s audio drop if you're playing local audio at the same time). I did find that the Spotify app was not highly reliable about playing to my Connect device when I was also playing other audio locally. Sometimes it would get confused and switch around where it wanted to play. (Admittedly this is a bizarre use case; I was intentionally trying to break it to figure out how they're doing it.)
EDIT: I don't believe Spotify is actually doing anything special here. I think they just play silent audio. I think getting two streams playing at the same time has to do with AirPlay rather than Spotify (I can't make it happen with Bluetooth, only AirPlay).
I have a CoreAudio based player that streams remote mp3s.
It uses NSURLConnection to retrieve the mp3 data -> uses AudioConverter to convert the stream into PCM -> and feeds the stream into an AUGraph to play audio.
The player works completely fine in my demo app (it only contains a play button), but when I add the player to another project, one that already makes networking calls and updates the UI, the player fails to play audio past a few seconds.
Am I possibly experiencing a threading issue? What are some preventative approaches I can take, or things I can look into, to stop this from happening?
You do not mention anything in your software architecture about buffering the data between receiving it via NSURLConnection and sending it to your player.
Data will arrive in chunks with inconsistent arrival rates.
Please see these answers I posted regarding buffering and network jitter.
Network jitter
and
Network jitter and buffering queue
In a nutshell, you cannot receive data and immediately send it to your player, because the next chunk may not arrive in time.
You don't mention the rate at which the mp3 file is delivered. If it is delivered very quickly over a fast connection, are you buffering all of the data received, or is it getting lost somewhere in your app? There is a chance your problem is that you are receiving too much data too fast and not properly buffering it.
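The buffering layer described above can be as simple as a thread-safe FIFO sitting between the network delegate and the audio path. A minimal sketch (plain Swift; `Data` chunks stand in for whatever packet type your converter produces):

```swift
import Foundation

// A minimal thread-safe FIFO: the network callback appends chunks at
// whatever rate they arrive, and the audio path drains them at its own rate.
final class JitterBuffer {
    private var chunks: [Data] = []
    private let lock = NSLock()

    // Called from the NSURLConnection/URLSession delegate thread.
    func enqueue(_ chunk: Data) {
        lock.lock(); defer { lock.unlock() }
        chunks.append(chunk)
    }

    // Called from the audio side; returns nil on underrun.
    func dequeue() -> Data? {
        lock.lock(); defer { lock.unlock() }
        return chunks.isEmpty ? nil : chunks.removeFirst()
    }

    // Total bytes currently buffered, useful for a "enough to start" check.
    var buffered: Int {
        lock.lock(); defer { lock.unlock() }
        return chunks.reduce(0) { $0 + $1.count }
    }
}
```

Note that a real AUGraph render callback should not take a lock (a lock-free ring buffer is the usual fix), but the shape of the design is the same: decouple arrival rate from consumption rate, and only start playback once `buffered` exceeds some threshold.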
I am using MPMusicPlayerController so my app can play music that the user has bought through iTunes. When I select a song and start to play, there is a lag before the sound starts. I am assuming that the song is being buffered from the cloud.
The problem is that I have not found a way to know when the buffering is complete and the audio actually starts.
To play the song I use:
_mediaController = [MPMusicPlayerController applicationMusicPlayer];
[_mediaController setQueueWithItemCollection:collection];
[_mediaController beginGeneratingPlaybackNotifications];
[_mediaController play];
As soon as I call "play", the playback state change notification fires and the playback state is "MPMusicPlaybackStatePlaying", even though the user can't hear any music. I've noticed that even though the mediaController is in the "playing" playback state, _mediaController.currentPlaybackTime always equals 0 until the music can actually be heard, at which point currentPlaybackTime properly tracks with the music.
I thought I could use the [_mediaController prepareToPlay] method to preload the audio file, but when I use that, the MPMediaPlaybackIsPreparedToPlayDidChangeNotification notification is never called. So the mediaController is never marked as "prepared".
With all this being said, I have not found a way to prebuffer songs using the MPMusicPlayerController. I know this issue has been around for a while because there is an old question from a few years ago with essentially the same issue, but no answer. Does anyone know how to make this work?
MPMediaPlaybackIsPreparedToPlayDidChangeNotification looks deprecated.
The MPMusicPlayerController getters and notifications are kind of garbage, and aren't consistent at all: you can set a value and not get the same value back when you read it again.
I solved this by "buffering" the song first, since my app frequently starts in the middle of a song. My buffering algorithm is: play the song, wait for the playback state changed notification, then pause it and wait for another notification. This process will reliably trigger the MPMusicPlayerControllerNowPlayingItemDidChangeNotification; after that, the song is finally ready to play or be mutated (setting currentPlaybackTime or rate). This seems to work very well.
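A sketch of that play-then-pause warm-up. The notification names are the real MPMusicPlayerController ones; the sequencing is only the workaround described above, and the exact order the notifications arrive in has varied across iOS versions:

```swift
import MediaPlayer

final class SongPreloader {
    private let player = MPMusicPlayerController.applicationMusicPlayer
    private var tokens: [NSObjectProtocol] = []

    func preload(completion: @escaping () -> Void) {
        player.beginGeneratingPlaybackNotifications()
        let center = NotificationCenter.default
        // Step 1: start playback, then wait for the state-change
        // notification and pause again.
        tokens.append(center.addObserver(
            forName: .MPMusicPlayerControllerPlaybackStateDidChange,
            object: player, queue: .main) { [weak self] _ in
                guard let self = self else { return }
                if self.player.playbackState == .playing {
                    self.player.pause()
                }
        })
        // Step 2: once the now-playing-item notification fires, the song
        // is actually ready to play or be mutated (rate, currentPlaybackTime).
        tokens.append(center.addObserver(
            forName: .MPMusicPlayerControllerNowPlayingItemDidChange,
            object: player, queue: .main) { _ in completion() })
        player.play()
    }
}
```

You would remove the observers (and likely mute the volume for the warm-up) in a production version; this only shows the notification choreography.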
The prepareToPlay completion handler is also garbage: it seems to fire before the song is actually ready to play, and the method actually seems to start playback, which is more than it lets on. This is commonly reported as a bug on the Apple developer forums.
In the prepareToPlay callback, setting currentPlaybackTime or Rate won't actually mutate the player - you need to wait for the additional MPMusicPlayerControllerNowPlayingItemDidChangeNotification sometime after playback starts for the first time on the song before you'll have any luck mutating any of the player properties.
currentPlaybackRate and currentPlaybackTime also aren't very reliable, although they become more reliable once playback actually starts. I cache whatever the user sets until playback starts, to solve the consistency problem. There is a lot more involved in reading currentPlaybackRate or currentPlaybackTime consistently; let me know if you want a code sample, as reading those properties will yield different results depending on the executing thread.
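The caching idea can be sketched generically. The player is abstracted behind a small protocol here (my own construction, not an Apple API), since the point is only the logic: trust your own cached value until playback is confirmed, and only then read the player's getter:

```swift
import Foundation

// Abstracts the one property we need, keeping the caching logic
// independent of MPMusicPlayerController's quirks.
protocol RatePlayer: AnyObject {
    var currentPlaybackRate: Float { get set }
}

final class CachedRateController {
    private let player: RatePlayer
    private var requestedRate: Float = 1.0
    // Flip this once playback is confirmed, e.g. after the
    // now-playing-item-did-change notification fires.
    var playbackConfirmed = false

    init(player: RatePlayer) { self.player = player }

    var rate: Float {
        get { playbackConfirmed ? player.currentPlaybackRate : requestedRate }
        set {
            requestedRate = newValue          // remember what the user asked for
            player.currentPlaybackRate = newValue
        }
    }
}
```

Before confirmation the controller answers with the cached value, so the UI stays consistent even if the underlying player getter momentarily reports something else.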
Checking the playback state isn't reliable either, unfortunately. I've often found the player reports MPMusicPlaybackStatePlaying when it's fully paused and not playing; in certain situations it stays like this indefinitely. I recommend determining whether the MPMusicPlayerController is actually playing by tracking your own calls to play or pause and the subsequent MPMusicPlayerControllerPlaybackStateDidChangeNotification confirming the player has started that action.
I should mention I'm on iOS 12.1.3, as Apple seems to intermittently fix a small bug or add new API breakages as time goes on. Since it's pretty broken now, any improvements may break whatever abstraction layer you build to fix it, so I'm testing each iOS release to make sure everything still "works".
Hope this helps a bit; it's a real struggle.
I've found a few related questions for Android but nothing for iOS.
Is there any possible way to override the phone's microphone once a phone call has been received and playback an audio file over the phone call? If it's not possible to override the microphone, is there a way to mix in an audio file along with the microphone?
I don't believe you can do what you're wanting. From Apple's Audio Session Programming Guide:
The system follows the inviolable rule that “the phone always wins.” No app, no matter how vehemently it demands priority, can trump the phone. When a call arrives, the user gets notified and your app is interrupted—no matter what audio operation you have in progress and no matter what category you have set.
Which, if you think about it, makes sense: A user is unlikely to want unexpected audio to interrupt or overlay a phone conversation.
I am using AVPlayer to play an audio stream, and it's possible to keep it playing in the background. I'm wondering how I could handle a situation where the user loses internet connectivity, so that I could provide some feedback or maybe try to re-establish playback after some seconds.
EDIT: I know the question is about AVPlayer, but an answer using MPMoviePlayerController might be useful as well. Right now, using MPMoviePlayerController, I'm trying to catch the MPMovieFinishReasonPlaybackError case of the MPMoviePlayerPlaybackDidFinishReasonUserInfoKey by subscribing to the MPMoviePlayerPlaybackDidFinishNotification. But if, for example, my audio is playing in the background and I turn airplane mode on, I never get that reason; I only get MPMovieFinishReasonPlaybackEnded, and I don't know how to distinguish that from the user stopping the audio himself.
I tried looking for the actual source, but I remember reading somewhere that if audio playback stops (for whatever reason), it kills the background thread. The person writing about the issue talked about possibly feeding the stream some empty audio content to keep the thread alive. You might be able to send a local notification from an error callback, notifying the user that the audio experienced an error and will have to be manually restarted from within the application. I haven't played with the API enough to know which callback is best in this case. If I find the link I'm looking for, I'll update.
EDIT: Here's Grant Pannell's take on audio streaming and multitasking.
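For the AVPlayer side of the question specifically, a sketch of the signals worth watching. These are real AVFoundation notifications and properties; whether a given network drop produces a stall or a hard failure depends on the stream and the OS version, so observing all three is the safer bet:

```swift
import AVFoundation

final class ConnectivityWatcher {
    private var observers: [Any] = []

    func watch(_ item: AVPlayerItem, onStall: @escaping () -> Void) {
        let center = NotificationCenter.default
        // Fires when playback fails mid-stream, e.g. connectivity lost.
        observers.append(center.addObserver(
            forName: .AVPlayerItemFailedToPlayToEndTime,
            object: item, queue: .main) { _ in onStall() })
        // Fires when the buffer runs dry and playback stalls.
        observers.append(center.addObserver(
            forName: .AVPlayerItemPlaybackStalled,
            object: item, queue: .main) { _ in onStall() })
        // Hard failures also surface via the item's status and error.
        observers.append(item.observe(\.status) { item, _ in
            if item.status == .failed { onStall() }
        })
    }
}
```

From `onStall` you can update the UI or schedule a retry; unlike the MPMoviePlayerController "finish reason", these callbacks don't fire when the user stops playback deliberately, which sidesteps the ambiguity described in the question.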