iOS AVPlayer EXC_BAD_ACCESS when accessing properties

I'm working on a plugin for Apache Cordova that will allow audio streaming from a remote URL through the Media API.
The issue I'm experiencing is that I get an EXC_BAD_ACCESS signal whenever I try to access certain properties on the AVPlayer instance; currentTime and isPlaying are the worst offenders. The player will be playing sound through the speakers, but as soon as my code reaches player.currentTime or [player currentTime] it crashes with the bad access signal.
[player play];
double position = round([player duration] * 1000) / 1000;
[player currentTime]; //This will cause the signal
I am using ARC, so I'm not releasing anything that shouldn't be.
Edit:
Everything that I've done has been hacking around on the Cordova 3 CDVSound class as a proof of concept for actual streaming on iOS.
The original code can be found here: https://github.com/apache/cordova-plugin-media/tree/master/src/ios
My code can be found here:
CDVSound.h
CDVSound.m
The method startPlayingAudio will trip an EXC_BAD_ACCESS at line 346. Removing line 346 allows audio to play, but it will trip a bad access later on when getCurrentPositionAudio is called and line 532 is reached.
Edit / Solution
So it turns out that the best way to handle this is to use an AVPlayerItem and then access the time with player.currentItem.currentTime. The real question then becomes: why isn't this behavior documented for AVPlayer, and why does it behave like this?
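For reference, here is a minimal sketch of that workaround, reading the time through the current AVPlayerItem instead of asking the AVPlayer directly (the URL and variable names are illustrative, not taken from the plugin code):
#import <AVFoundation/AVFoundation.h>

AVPlayerItem *item = [AVPlayerItem playerItemWithURL:[NSURL URLWithString:@"http://example.com/stream.mp3"]]; // placeholder URL
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
[player play];

// Read the position from the item rather than from the player itself.
CMTime time = player.currentItem.currentTime;
if (CMTIME_IS_VALID(time)) {
    double position = round(CMTimeGetSeconds(time) * 1000) / 1000;
    NSLog(@"Current position: %.3f s", position);
}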

Related

CHHapticEngine: Properly stop engine after a playback

I was trying to play a haptic "AHAP" pattern from a file with the following code:
__strong static CHHapticEngine *engine;
engine = [[CHHapticEngine alloc] initAndReturnError:nil];
[engine startAndReturnError:nil];
[engine playPatternFromURL:[NSURL fileURLWithPath:@"/path/to/pattern.ahap"] error:nil];
The haptics do play successfully, but I have an issue: whenever the pattern is played, the first key press on the keyboard makes a very loud flick sound (the default iOS flick sound), and it goes back to normal on the second press. I thought it was because the engine was still active, so I called
[engine stopWithCompletionHandler:nil];
but then the haptic doesn't play anymore (however, the flick sound is normal for the first key press). playPatternFromURL:error: is supposed to play synchronously, which means it should finish playing before stopWithCompletionHandler: executes (per the Apple docs). I honestly have no idea why or how this happens. Core Haptics is rarely seen implemented in the wild or on GitHub outside the official Apple docs, so I have no useful references (maybe except this one on GitHub).
Any idea on this particular issue? Thanks in advance.
EDIT:
For future readers: I managed to mitigate this issue by playing it on another thread:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    [engine playPatternFromURL:[NSURL fileURLWithPath:@"/path/to/pattern.ahap"] error:nil];
});
Perhaps this is due to the framework being beta software as of writing.
EDIT 2:
The above mitigation, however, doesn't solve it if the pattern contains CHHapticEventTypeAudioCustom events.
I managed to solve it using the code below:
__strong static CHHapticEngine *engine;
engine = [[CHHapticEngine alloc] initAndReturnError:nil];
[engine startAndReturnError:nil];
[engine playPatternFromURL:[NSURL fileURLWithPath:@"/path/to/pattern.ahap"] error:nil];
[engine notifyWhenPlayersFinished:^CHHapticEngineFinishedAction(NSError * _Nullable error) {
    [engine stopWithCompletionHandler:nil];
    return CHHapticEngineFinishedActionStopEngine;
}];
It seems I need to observe when the pattern has stopped playing and then stop the engine (it's not due to the framework being beta software, my bad). However, for the method playPatternFromURL:error:, quoting from the Apple docs:
This method blocks processing on the current thread until the pattern
has finished playing.
doesn't seem to mean what it says, at least to my understanding. That's why calling stopWithCompletionHandler: right after playPatternFromURL:error: failed to trigger any haptics.
Solution:
engine.playsHapticsOnly = YES;
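Putting the pieces above together, a minimal sketch of the whole sequence might look like the following; playsHapticsOnly makes the engine ignore audio content in the pattern, which is presumably what keeps it from interfering with the system keyboard sound (the .ahap path is a placeholder):
#import <CoreHaptics/CoreHaptics.h>

NSError *error = nil;
CHHapticEngine *engine = [[CHHapticEngine alloc] initAndReturnError:&error];

// Ignore audio events in the pattern so the engine leaves the audio route alone.
engine.playsHapticsOnly = YES;

if ([engine startAndReturnError:&error]) {
    [engine playPatternFromURL:[NSURL fileURLWithPath:@"/path/to/pattern.ahap"] error:&error];

    // Stop the engine only once all haptic players have actually finished.
    [engine notifyWhenPlayersFinished:^CHHapticEngineFinishedAction(NSError * _Nullable err) {
        return CHHapticEngineFinishedActionStopEngine;
    }];
}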

AudioQueueStart return -66681

When I use <AudioToolbox/AudioServices.h> to implement a recording function, recording sometimes fails because AudioQueueStart returns -66681. The documentation says:
The audio queue has encountered a problem and cannot start
I found the documentation, but I still have no idea what causes it.
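For what it's worth, -66681 is the value of kAudioQueueErr_CannotStart. Below is a minimal sketch of checking for it explicitly; making sure the AVAudioSession is configured and active before starting the queue is a common (though not guaranteed) mitigation, and StartRecordingQueue is a hypothetical wrapper name, not an API:
#import <AudioToolbox/AudioToolbox.h>
#import <AVFoundation/AVFoundation.h>

// `queue` is assumed to be an AudioQueueRef already configured for recording.
static void StartRecordingQueue(AudioQueueRef queue)
{
    // An inactive or interrupted audio session is one common cause of this error.
    NSError *sessionError = nil;
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryRecord error:&sessionError];
    [[AVAudioSession sharedInstance] setActive:YES error:&sessionError];

    OSStatus status = AudioQueueStart(queue, NULL);
    if (status == kAudioQueueErr_CannotStart) {      // -66681
        NSLog(@"AudioQueueStart failed: kAudioQueueErr_CannotStart (%d)", (int)status);
    } else if (status != noErr) {
        NSLog(@"AudioQueueStart failed with OSStatus %d", (int)status);
    }
}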

Is it possible to get audio from an ICY stream with percentage and seek function

I'm trying to play audio from an ICY stream. I'm able to play it with AVPlayer and some good open source libraries, but I'm not able to control the stream. I have no idea how I can get the percentage played or how to seek to a specific time in the stream. Is that possible? Is there a good library that can help me?
Currently I'm using AFSoundManager, but I always receive negative numbers for the percentage and an invalid time when trying to seek the stream to a specified time.
That's the code that I'm using:
AFSoundManager.sharedManager().startStreamingRemoteAudioFromURL("http://www.abstractpath.com/files/audiosamples/sample.mp3") { (percentage, elapsedTime, timeRemaining, error, poppi) in
    if error == nil {
        // This block will be fired when the audio progress increases in 1%
        if elapsedTime > 0 {
            println(elapsedTime)
            self.slider.value = Float(elapsedTime*1000)
        }
    } else {
        // Handle the error
        println(error)
    }
}
I'm able, of course, to get the elapsedTime, but not the percentage or the timeRemaining; I always get negative numbers.
This code works perfectly with remote or local audio files, but not with the stream.
This isn't possible.
These streams are live. There is nothing to seek to because what you haven't heard hasn't happened yet. Even streams that play back music end-to-end are still "live" in the sense that the audio you haven't received hasn't been encoded yet. (Small codec and transit buffers aside, of course.)
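This also explains the negative numbers: for a live ICY stream the item's duration is typically indefinite, so any percentage or remaining time derived from it is meaningless. A small sketch of checking this with plain AVPlayer (the stream URL is a placeholder):
#import <AVFoundation/AVFoundation.h>

AVPlayer *player = [AVPlayer playerWithURL:[NSURL URLWithString:@"http://example.com/live-icy-stream"]]; // placeholder
[player play];

// In real code, wait until the item's status is AVPlayerItemStatusReadyToPlay before reading duration.
CMTime duration = player.currentItem.duration;
if (CMTIME_IS_INDEFINITE(duration)) {
    // Live stream: there is no total length, so "percentage played" is undefined.
    NSLog(@"Live stream - elapsed: %.1f s", CMTimeGetSeconds(player.currentItem.currentTime));
} else {
    double percent = CMTimeGetSeconds(player.currentItem.currentTime) / CMTimeGetSeconds(duration) * 100.0;
    NSLog(@"Played %.1f%%", percent);
}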

AVPlayerItem strange buffer observer

I'm using AVPlayer to play streamed network audio. I observe the status of the streamed item as in this post:
ios avplayer trigger streaming is out of buffer
It seems to work well, but I've encountered a strange problem: I receive the key "playbackLikelyToKeepUp" before the key "playbackBufferEmpty". I placed a log
NSLog(@"___path: %@", path);
in the first line of observeValueForKeyPath:ofObject:change:context:
and the log I received is:
...
2012-10-29 17:24:35.412 NhacSo[236:907] ___path: rate
2012-10-29 17:24:35.413 NhacSo[236:907] ___path: playbackLikelyToKeepUp
2012-10-29 17:24:35.415 NhacSo[236:907] ___path: playbackBufferEmpty
2012-10-29 17:24:35.416 NhacSo[236:907] ___path: rate
...
Do you know why I receive "playbackLikelyToKeepUp" before "playbackBufferEmpty"? Thank you!!!
You receive playbackLikelyToKeepUp first because that property changes first. What I believe is confusing you is that it changes from YES to NO and not the other way around - that is, playback will no longer be able to keep up.
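A small sketch of how one might make that visible inside the observer, by logging the new value from the change dictionary instead of just the key path (this assumes the observers were added with NSKeyValueObservingOptionNew; the key-path strings match the ones used in the question):
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    NSLog(@"___path: %@", keyPath);

    if ([keyPath isEqualToString:@"playbackLikelyToKeepUp"]) {
        BOOL keepUp = [change[NSKeyValueChangeNewKey] boolValue];
        // NO here means playback is about to fall behind, which is why this
        // notification can arrive just before playbackBufferEmpty.
        NSLog(@"likelyToKeepUp -> %@", keepUp ? @"YES" : @"NO");
    } else if ([keyPath isEqualToString:@"playbackBufferEmpty"]) {
        BOOL empty = [change[NSKeyValueChangeNewKey] boolValue];
        NSLog(@"bufferEmpty -> %@", empty ? @"YES" : @"NO");
    }
}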

recording against a metronome of set length using remote IO

I was able to create the exact functionality I wanted with AVAudioPlayer and AVAudioRecorder, but of course experienced latency problems. So after reading pretty much every article on the web and reviewing stacks of sample code, I'm still not sure how to achieve the following:
User chooses to record a sample 2 bars long (4 beats per bar) with a pre-roll/count-in
User clicks record
A metronome starts which counts in 4 beats (accent on the first beat)
The app automatically starts recording on the start of the next bar
The app automatically turns off recording at the end of the 3rd bar (the 2 bars + the pre-roll)
The user can then playback their recording or delete it and start again.
So, with AVAudioPlayer and AVAudioRecorder I simply created a .caf in Audacity with a metronome set at the correct BPM (the BPM is fixed by the app). I then set up and played the AVAudioPlayer and, using the audioPlayerDidFinishPlaying:successfully: delegate method, performed some logic to start the recorder, restart the player, maintain a loop count, etc., in order to turn off recording and audio.
As I mentioned, I was pretty much able to achieve the user experience I'm after, but the latency problems are not acceptable.
I have been working with Audio Units and the Remote IO unit, and have set up a project with a playback callback, a recorder callback, etc., but now face the problem of working out how to make this fit the description above. I am trying to work out the following things for starters:
If I create a 1-beat .caf file, how could I make use of Audio Units and Remote IO to play x amount of beats and then stop?
How could I do the pre-roll and start the recording callback after 4 beats?
Can anyone give me some ideas or point me in the right direction? As I have mentioned, I have already done a stack of research, including buying the Core Audio book, reading every article on atastypixel.com, timbolstad.com, etc., and trawling through the Apple docs.
Thanks in advance for your help.
I start an NSTimer whose interval is derived from the BPM: 60 seconds per minute divided by the beats per minute gives the length of one beat. So if the user wants to record a 2-bar file with a count-in, you might do something like this:
// one beat at 100 BPM = 60 / 100 = 0.6 seconds
timerInterval = 60.0 / 100.0;
metroTimer = [NSTimer scheduledTimerWithTimeInterval:timerInterval target:self selector:@selector(blinkMetroLight) userInfo:nil repeats:YES];
- (void)blinkMetroLight
{
    if (beatNumber == 0)
    {
        beatNumber = 1;
    }
    else if (beatNumber == 5)
    {
        // count-in finished - start recording
        [self audioProcessorStart];
    }
    if (beatNumber == 8)
    {
        // last beat reached - stop recording and kill the metronome
        [self audioProcessorStop];
        [metroTimer invalidate];
        metroTimer = nil;
    }
    beatNumber++;
}
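Since NSTimer fires on the main run loop, the approach above still carries some of the latency the question is trying to escape. With the Remote IO unit already set up, one common alternative is to count frames inside the render callback, because the hardware sample rate gives an exact frame count per beat. A rough sketch of that idea follows; gSampleRate, gBPM, gFrameCount and gRecording are illustrative globals, not names from the question's project:
#import <AudioToolbox/AudioToolbox.h>

static Float64 gSampleRate = 44100.0;  // illustrative: take this from the audio session in real code
static Float64 gBPM        = 100.0;    // illustrative: the app's fixed tempo
static SInt64  gFrameCount = 0;        // frames rendered since the metronome started
static bool    gRecording  = false;    // flag read by the input/recording callback

static OSStatus MetronomeRenderCallback(void *inRefCon,
                                        AudioUnitRenderActionFlags *ioActionFlags,
                                        const AudioTimeStamp *inTimeStamp,
                                        UInt32 inBusNumber,
                                        UInt32 inNumberFrames,
                                        AudioBufferList *ioData)
{
    // Exact number of frames in one beat at the current tempo.
    SInt64 framesPerBeat = (SInt64)(gSampleRate * 60.0 / gBPM);
    SInt64 beat = gFrameCount / framesPerBeat;   // 0-based beat index

    // Start recording once beat 4 is reached (after the 4-beat count-in) and stop
    // once beat 12 is reached (count-in + 2 bars of 4 beats). This is accurate to
    // within one render buffer rather than one run-loop pass.
    if (!gRecording && beat >= 4 && beat < 12) {
        gRecording = true;
    } else if (gRecording && beat >= 12) {
        gRecording = false;
    }

    gFrameCount += inNumberFrames;

    // ... render the metronome click (or silence) into ioData here ...
    return noErr;
}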
