How to save the position of the audio file? Objective-C - iOS

How can I save the point at which audio was stopped in one session and resume playback from that point in the next session?
My code:
- (void)initPlayer:(NSString*)audioFile fileExtension:(NSString*)fileExtension
{
    NSURL *audioFileLocationURL = [[NSBundle mainBundle] URLForResource:audioFile withExtension:fileExtension];
    NSError *error;
    self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:audioFileLocationURL error:&error];
    if ([audioFile isEqualToString:@"2"]) {
        _index = 1;
    }
    else if ([audioFile isEqualToString:@"3"]) {
        _index = 2;
    }
    [self song];
}
- (void)playAudio {
    [self.audioPlayer play];
}
- (void)pauseAudio {
    [self.audioPlayer pause];
}
- (BOOL)isPlaying {
    return [self.audioPlayer isPlaying];
}
- (NSString*)timeFormat:(float)value {
    int roundedMinutes = lroundf(value) / 60;
    int roundedSeconds = lroundf(value) - (roundedMinutes * 60);
    NSString *time = [[NSString alloc] initWithFormat:@"%d:%02d",
                      roundedMinutes, roundedSeconds];
    return time;
}
- (void)setCurrentAudioTime:(float)value {
    [self.audioPlayer setCurrentTime:value];
}
- (NSTimeInterval)getCurrentAudioTime {
    return [self.audioPlayer currentTime];
}
- (float)getAudioDuration {
    return [self.audioPlayer duration];
}

You can use AVPlayer's currentTime property. It returns the playback time of the current AVPlayerItem.
To restore the playback time in the next session, you can pass the stored time to AVPlayer's seekToTime:
[self.player seekToTime:storedPlaybackTime];
https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVPlayer_Class/index.html#//apple_ref/doc/uid/TP40009530-CH1-SW2
https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVPlayer_Class/index.html#//apple_ref/occ/instm/AVPlayer/seekToTime%3a
To persist the CMTime returned by currentTime, you can use the AVFoundation convenience methods provided by NSValue.
To wrap CMTime in an NSValue, use valueWithCMTime:
[NSValue valueWithCMTime:player.currentTime];
To get a CMTime struct back from the persisted value, use:
CMTime persistedTime = [storedValue CMTimeValue];
After you have wrapped the CMTime struct in an NSValue instance, you can use a keyed archiver and NSData to write the time to disk.
NSHipster has a good article about that topic: http://nshipster.com/nscoding/
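As a concrete illustration, saving and restoring could look roughly like this (a minimal sketch, assuming you store the archived value in NSUserDefaults; the key name "storedPlaybackTime" is an assumption):
// Saving: wrap the current CMTime in an NSValue and archive it.
NSValue *timeValue = [NSValue valueWithCMTime:self.player.currentTime];
NSData *timeData = [NSKeyedArchiver archivedDataWithRootObject:timeValue];
[[NSUserDefaults standardUserDefaults] setObject:timeData forKey:@"storedPlaybackTime"];

// Restoring in the next session: unarchive and seek.
NSData *storedData = [[NSUserDefaults standardUserDefaults] objectForKey:@"storedPlaybackTime"];
if (storedData) {
    NSValue *storedValue = [NSKeyedUnarchiver unarchiveObjectWithData:storedData];
    CMTime storedPlaybackTime = [storedValue CMTimeValue];
    [self.player seekToTime:storedPlaybackTime];
}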

The easiest way would be to keep a local store (a small database or key-value store) that records the song name together with the position at which playback stopped. When playback resumes later, check the store first; if there is an entry for that song, continue from the saved position, otherwise start from the beginning.
Also make sure no entry remains if the song finishes.
Hope this idea helps you.
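A minimal sketch of that idea using NSUserDefaults as the local store, keyed by file name (the key scheme, the method names, and the currentFileName property are assumptions, not part of the original code):
// Save the position when playback is stopped or paused.
- (void)savePositionForFile:(NSString *)audioFile {
    NSString *key = [NSString stringWithFormat:@"position-%@", audioFile];
    [[NSUserDefaults standardUserDefaults] setDouble:self.audioPlayer.currentTime forKey:key];
}

// Restore it in the next session before calling play.
- (void)restorePositionForFile:(NSString *)audioFile {
    NSString *key = [NSString stringWithFormat:@"position-%@", audioFile];
    NSTimeInterval saved = [[NSUserDefaults standardUserDefaults] doubleForKey:key];
    if (saved > 0) {
        self.audioPlayer.currentTime = saved;
    }
}

// Remove the entry when the song finishes (AVAudioPlayerDelegate callback).
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    NSString *key = [NSString stringWithFormat:@"position-%@", self.currentFileName];
    [[NSUserDefaults standardUserDefaults] removeObjectForKey:key];
}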


Slow buffering when streaming multiple remote videos with AVPlayer and AVMutableComposition

[Edit: I was able to figure out a workaround for this, see below.]
I'm trying to stream multiple remote MP4 clips from S3, playing them in a sequence as one continuous video (to enable scrubbing within and between clips) with no stuttering, without explicitly downloading them to the device first. However, I find that the clips buffer very slowly (even on a fast network connection) and have been unable to find an adequate way to address that.
I've been trying to use AVPlayer for this, since AVPlayer with AVMutableComposition plays the supplied video tracks as one continuous track (unlike AVQueuePlayer, which I gather plays each video separately and thus doesn't support continuous scrubbing between the clips).
When I stick one of the assets directly into an AVPlayerItem and play that (with no AVMutableComposition), it buffers fast. But using AVMutableComposition, the video starts stuttering very badly on the second clip (my test case has 6 clips, each around 6 seconds), while the audio keeps going. After it plays through once, it plays perfectly smoothly if I rewind to the beginning, so I assume the problem lies in the buffering.
My current attempt to fix this problem feels convoluted, given that this seems like a rather basic use-case for AVPlayer - I do hope there's a simpler solution to all this that works properly. Somehow I doubt that the buffering player I use below is really necessary, but I'm running out of ideas.
Here's the main code that sets up the AVMutableComposition:
// Build an AVAsset for each of the source URIs
- (void)prepareAssetsForSources:(NSArray *)sources
{
    NSMutableArray *assets = [[NSMutableArray alloc] init];  // the assets to be used in the AVMutableComposition
    NSMutableArray *offsets = [[NSMutableArray alloc] init]; // for tracking buffering progress
    CMTime currentOffset = kCMTimeZero;
    for (NSDictionary *source in sources) {
        bool isNetwork = [RCTConvert BOOL:[source objectForKey:@"isNetwork"]];
        bool isAsset = [RCTConvert BOOL:[source objectForKey:@"isAsset"]];
        NSString *uri = [source objectForKey:@"uri"];
        NSString *type = [source objectForKey:@"type"];
        NSURL *url = isNetwork ?
            [NSURL URLWithString:uri] :
            [[NSURL alloc] initFileURLWithPath:[[NSBundle mainBundle] pathForResource:uri ofType:type]];
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
        currentOffset = CMTimeAdd(currentOffset, asset.duration);
        [assets addObject:asset];
        [offsets addObject:[NSNumber numberWithFloat:CMTimeGetSeconds(currentOffset)]];
    }
    _clipAssets = assets;
    _clipEndOffsets = offsets;
}
// Called with _clipAssets
- (AVPlayerItem*)playerItemForAssets:(NSMutableArray *)assets
{
    AVMutableComposition* composition = [AVMutableComposition composition];
    for (AVAsset* asset in assets) {
        CMTimeRange editRange = CMTimeRangeMake(CMTimeMake(0, 600), asset.duration);
        NSError *editError;
        [composition insertTimeRange:editRange
                             ofAsset:asset
                              atTime:composition.duration
                               error:&editError];
    }
    AVPlayerItem* playerItem = [AVPlayerItem playerItemWithAsset:composition];
    return playerItem; // this is used to initialize the main player
}
My initial thought was: Since it buffers fast with a vanilla AVPlayerItem, why not maintain a separate buffering player that's loaded with each asset in turn (with no AVMutableComposition) to buffer the assets for the main player?
- (void)startBufferingClips
{
    _bufferingPlayerItem = [AVPlayerItem playerItemWithAsset:_clipAssets[0]
                            automaticallyLoadedAssetKeys:@[@"tracks"]];
    _bufferingPlayer = [AVPlayer playerWithPlayerItem:_bufferingPlayerItem];
    _currentlyBufferingIndex = 0;
}
// called every 250 msecs via an addPeriodicTimeObserverForInterval on the main player
- (void)updateBufferingProgress
{
    // If the playable (loaded) range is within 100 milliseconds of the clip
    // currently being buffered, load the next clip into the buffering player.
    float playableDuration = [[self calculateBufferedDuration] floatValue];
    CMTime totalDurationTime = [self playerItemDuration:_bufferingPlayer];
    Float64 totalDurationSeconds = CMTimeGetSeconds(totalDurationTime);
    bool bufferingComplete = totalDurationSeconds - playableDuration < 0.1;
    float bufferedSeconds = [self bufferedSeconds:playableDuration];
    float playerTimeSeconds = CMTimeGetSeconds([_player currentTime]);
    __block NSUInteger playingClipIndex = 0;
    // find the index of _player's currently playing clip
    [_clipEndOffsets enumerateObjectsUsingBlock:^(id offset, NSUInteger idx, BOOL *stop) {
        if (playerTimeSeconds < [offset floatValue]) {
            playingClipIndex = idx;
            *stop = YES;
        }
    }];
    // TODO: if bufferedSeconds - playerTimeSeconds <= 0, pause the main player
    if (bufferingComplete && _currentlyBufferingIndex < [_clipAssets count] - 1) {
        // We're done buffering this clip, load the buffering player with the next asset
        _currentlyBufferingIndex += 1;
        _bufferingPlayerItem = [AVPlayerItem playerItemWithAsset:_clipAssets[_currentlyBufferingIndex]
                                automaticallyLoadedAssetKeys:@[@"tracks"]];
        _bufferingPlayer = [AVPlayer playerWithPlayerItem:_bufferingPlayerItem];
    }
}
- (float)bufferedSeconds:(float)playableDuration {
    __block float seconds = 0.0; // total duration of clips already buffered
    if (_currentlyBufferingIndex > 0) {
        [_clipEndOffsets enumerateObjectsUsingBlock:^(id offset, NSUInteger idx, BOOL *stop) {
            if (idx + 1 >= _currentlyBufferingIndex) {
                seconds = [offset floatValue];
                *stop = YES;
            }
        }];
    }
    return seconds + playableDuration;
}
- (NSNumber *)calculateBufferedDuration {
    AVPlayerItem *video = _bufferingPlayer.currentItem;
    if (video.status == AVPlayerItemStatusReadyToPlay) {
        // initialize to 0 so the comparisons below are well defined
        __block float longestPlayableRangeSeconds = 0.0;
        [video.loadedTimeRanges enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
            CMTimeRange timeRange = [obj CMTimeRangeValue];
            float seconds = CMTimeGetSeconds(CMTimeRangeGetEnd(timeRange));
            if (seconds > 0.1 && seconds > longestPlayableRangeSeconds) {
                longestPlayableRangeSeconds = seconds;
            }
        }];
        if (longestPlayableRangeSeconds > 0) {
            return [NSNumber numberWithFloat:longestPlayableRangeSeconds];
        }
    }
    return [NSNumber numberWithInteger:0];
}
It initially seemed that this worked like a charm, but then I switched to another set of test clips and then the buffering was very slow again (the buffering player helped, but not enough). It seems like the loadedTimeRanges for the assets as loaded into the buffering player didn't match the loadedTimeRanges for the same assets inside the AVMutableComposition: Even after the loadedTimeRanges for each item loaded into the buffering player indicated that the whole asset had been buffered, the main player's video continued stuttering (while the audio played seamlessly to the end). Again, the playback was seamless after rewinding once the main player had run through all the clips once.
I hope the answer to this, whatever it is, will prove useful as a starting point for other iOS developers trying to implement this basic use-case. Thanks!
Edit: Since I posted this question, I made the following workaround for this. Hopefully this will save whoever runs into this some headache.
What I ended up doing was maintaining two buffering players (both AVPlayers) that started buffering the first two clips, moving on to the lowest-indexed unbuffered clip after their loadedTimeRanges indicated that buffering for their current clip was done. I made the logic pause/unpause playback based on the clips currently buffered, and the loadedTimeRanges of the buffering players, plus a small margin. This needed a few bookkeeping variables, but wasn't too complicated.
This is how the buffering players were initialized (I'm omitting the bookkeeping logic here):
- (void)startBufferingClips
{
    _bufferingPlayerItemA = [AVPlayerItem playerItemWithAsset:_clipAssets[0]
                             automaticallyLoadedAssetKeys:@[@"tracks"]];
    _bufferingPlayerA = [AVPlayer playerWithPlayerItem:_bufferingPlayerItemA];
    _currentlyBufferingIndexA = [NSNumber numberWithInt:0];
    if ([_clipAssets count] > 1) {
        _bufferingPlayerItemB = [AVPlayerItem playerItemWithAsset:_clipAssets[1]
                                 automaticallyLoadedAssetKeys:@[@"tracks"]];
        _bufferingPlayerB = [AVPlayer playerWithPlayerItem:_bufferingPlayerItemB];
        _currentlyBufferingIndexB = [NSNumber numberWithInt:1];
        _nextIndexToBuffer = [NSNumber numberWithInt:2];
    } else {
        _nextIndexToBuffer = [NSNumber numberWithInt:1];
    }
}
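The pause/unpause bookkeeping described above (and omitted here) could look roughly like this; a sketch only, where bufferedSecondsSoFar, _bufferMargin and _pausedForBuffering are assumed names rather than the original code:
- (void)updatePlaybackForBufferingState
{
    float playerTimeSeconds = CMTimeGetSeconds([_player currentTime]);
    // Duration covered by fully buffered clips plus the loadedTimeRanges of the buffering players.
    float bufferedSeconds = [self bufferedSecondsSoFar];
    if (bufferedSeconds - playerTimeSeconds < _bufferMargin) {
        // Not enough data ahead of the playhead: pause until the buffering players catch up.
        if (_player.rate > 0) {
            [_player pause];
            _pausedForBuffering = YES;
        }
    } else if (_pausedForBuffering) {
        [_player play];
        _pausedForBuffering = NO;
    }
}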
In addition, I needed to make sure that the video and audio tracks weren't being merged as they were added to the AVMutableComposition, as this apparently interfered with the buffering (perhaps they didn't register as the same video/audio tracks as those the buffering players were loading, and thus didn't receive the new data). Here's the code where the AVMutableComposition is built from an array of AVAssets:
- (AVPlayerItem*)playerItemForAssets:(NSMutableArray *)assets
{
    AVMutableComposition* composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *compVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *compAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    CMTime timeOffset = kCMTimeZero;
    for (AVAsset* asset in assets) {
        CMTimeRange editRange = CMTimeRangeMake(CMTimeMake(0, 600), asset.duration);
        NSError *editError;
        NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
        NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
        if ([videoTracks count] > 0) {
            AVAssetTrack *videoTrack = [videoTracks objectAtIndex:0];
            [compVideoTrack insertTimeRange:editRange
                                    ofTrack:videoTrack
                                     atTime:timeOffset
                                      error:&editError];
        }
        if ([audioTracks count] > 0) {
            AVAssetTrack *audioTrack = [audioTracks objectAtIndex:0];
            [compAudioTrack insertTimeRange:editRange
                                    ofTrack:audioTrack
                                     atTime:timeOffset
                                      error:&editError];
        }
        if ([videoTracks count] > 0 || [audioTracks count] > 0) {
            timeOffset = CMTimeAdd(timeOffset, asset.duration);
        }
    }
    AVPlayerItem* playerItem = [AVPlayerItem playerItemWithAsset:composition];
    return playerItem;
}
With this approach, buffering while using AVMutableComposition for the main player works nice and fast, at least in my setup.

AVPlayer removing a periodicTimeObserver

I'm having difficulty stopping an AVPlayer's time observer.
I have an AVPlayer player running like this:
player = [[AVPlayer alloc] initWithURL:[NSURL fileURLWithPath:path]];
then I add an observer
[player addPeriodicTimeObserverForInterval:CMTimeMake(3, 10) queue:NULL usingBlock:^(CMTime time) {
    NSTimeInterval seconds = CMTimeGetSeconds(time);
    NSLog(@"observer called");
    for (NSDictionary *item in robotR33) {
        NSNumber *time = item[@"time"];
        if ( seconds > [time doubleValue] && [time doubleValue] >= [lastTime doubleValue] ) {
            // NSLog(@"LastTime: %qi", [lastTime longLongValue]);
            lastTime = @(seconds);
            NSString *str = item[@"line"];
            [weakSelf nextLine:str];
            // NSLog(@"item: %qi", [time longLongValue]);
            // NSLog(@"Seconds: %f", seconds)
        };
    }
}];
[player play];
once I am finished with the player I do this:
[player pause];
[player removeTimeObserver:self.timeObserver]
player = nil;
The weird thing is that when I put a breakpoint in the code and step through it in Xcode, it works: I can see that the block stops printing "observer called".
But when I run the code normally with no breakpoints, the observer is still firing at the same interval after [player removeTimeObserver:] has been called.
Any ideas?
Glad to see weakSelf worked for you ...
In the above I don't see you assigning the result of
[player addPeriodicTimeObserverForInterval ...
to self.timeObserver. It may be a typo, but it should be:
self.timeObserver = [player addPeriodicTimeObserverForInterval ...
if you intend to call
[player removeTimeObserver:self.timeObserver];
You also missed the ";" above.
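Put together, the fix looks roughly like this (a sketch; the timeObserver property is assumed to be declared strong on your class):
// Keep the opaque token returned by the player so it can be removed later.
self.timeObserver = [player addPeriodicTimeObserverForInterval:CMTimeMake(3, 10)
                                                          queue:NULL
                                                     usingBlock:^(CMTime time) {
    // ... same block body as above ...
}];

// When you're done with the player:
[player pause];
[player removeTimeObserver:self.timeObserver];
self.timeObserver = nil;
player = nil;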

How to play tracks from iPod music library with different volume than system volume?

How can I play music from the iPod music library (like user-defined playlists, etc.) at a different volume than the system volume?
This is for anyone who is trying to play music or playlists from the iPod music library at a different volume than the system volume. There are several posts out there saying that [MPMusicPlayerController applicationMusicPlayer] can do this, but I have found that any time I change the volume of the applicationMusicPlayer, the system volume changes too.
There is a more involved method of playing music using the AVAudioPlayer class, but it requires you to copy music files from the iPod library into the application's storage, and that can get tricky when you're playing dynamic things, like user-generated playlists. That technique does give you access to the bytes, though, and is the way to go if you want to do processing on the data (like a DJ app). Link to that solution HERE.
The solution I went with uses the AVPlayer class; there are several good posts out there about how to do it. This post is basically a composite of several different solutions I found on Stack Overflow and elsewhere.
I have the following Frameworks linked:
AVFoundation
MediaPlayer
AudioToolbox
CoreAudio
CoreMedia
(I'm not sure if all of those are critical, but that's what I have. I have some OpenAL stuff implemented too that I don't show in the following code)
// Presumably in your SoundManager.m file (or whatever you call it) ...
#import <CoreAudio/CoreAudioTypes.h>
#import <AudioToolbox/AudioToolbox.h>
@interface SoundManager()
@property (retain, nonatomic) AVPlayer* audioPlayer;
@property (retain, nonatomic) AVPlayerItem* currentItem;
@property (retain, nonatomic) MPMediaItemCollection* currentPlaylist;
@property (retain, nonatomic) MPMediaItem* currentTrack;
@property (assign, nonatomic) MPMusicPlaybackState currentPlaybackState;
@end
@implementation SoundManager
@synthesize audioPlayer;
@synthesize currentItem = m_currentItem;
@synthesize currentPlaylist;
@synthesize currentTrack;
@synthesize currentPlaybackState;
- (id) init
{
    ...
    //Define an AVPlayer instance
    AVPlayer* tempPlayer = [[AVPlayer alloc] init];
    self.audioPlayer = tempPlayer;
    [tempPlayer release];
    ...
    //load the playlist you want to play
    MPMediaItemCollection* playlist = [self getPlaylistWithName: @"emo-pop-unicorn-blood-rage-mix-to-the-max"];
    if(playlist)
        [self loadPlaylist: playlist];
    ...
    //initialize the playback state
    self.currentPlaybackState = MPMusicPlaybackStateStopped;
    //start the music playing
    [self playMusic];
    ...
}
//Have a way to get a playlist reference (as an MPMediaItemCollection in this case)
- (MPMediaItemCollection*) getPlaylistWithName:(NSString *)playlistName
{
    MPMediaQuery* query = [[MPMediaQuery alloc] init];
    MPMediaPropertyPredicate* mediaTypePredicate = [MPMediaPropertyPredicate predicateWithValue: [NSNumber numberWithInteger: MPMediaTypeMusic] forProperty:MPMediaItemPropertyMediaType];
    [query addFilterPredicate: mediaTypePredicate];
    [query setGroupingType: MPMediaGroupingPlaylist];
    NSArray* playlists = [query collections];
    [query release];
    for(MPMediaItemCollection* testPlaylist in playlists)
    {
        NSString* testPlaylistName = [testPlaylist valueForProperty: MPMediaPlaylistPropertyName];
        if([testPlaylistName isEqualToString: playlistName])
            return testPlaylist;
    }
    return nil;
}
//Override the setter on currentItem so that you can add/remove
//the notification listener that will tell you when the song has completed
- (void) setCurrentItem:(AVPlayerItem *)currentItem
{
    if(m_currentItem)
    {
        [[NSNotificationCenter defaultCenter] removeObserver:self name:AVPlayerItemDidPlayToEndTimeNotification object:m_currentItem];
        [m_currentItem release];
    }
    if(currentItem)
        m_currentItem = [currentItem retain];
    else
        m_currentItem = nil;
    if(m_currentItem)
    {
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(handleMusicTrackFinished) name:AVPlayerItemDidPlayToEndTimeNotification object:m_currentItem];
    }
}
//handler that gets called when the AVPlayerItemDidPlayToEndTimeNotification notification fires
- (void) handleMusicTrackFinished
{
    [self skipSongForward]; //or something similar
}
//Have a way to load a playlist
- (void) loadPlaylist:(MPMediaItemCollection *)playlist
{
    self.currentPlaylist = playlist;
    self.currentTrack = [playlist.items objectAtIndex: 0];
}
//Play the beats, yo
- (void) playMusic
{
    //check the current playback state and exit early if we're already playing something
    if(self.currentPlaybackState == MPMusicPlaybackStatePlaying)
        return;
    if(self.currentPlaybackState == MPMusicPlaybackStatePaused)
    {
        [self.audioPlayer play];
    }
    else if(self.currentTrack)
    {
        //Get the system url of the current track, and use that to make an AVAsset object
        NSURL* url = [self.currentTrack valueForProperty:MPMediaItemPropertyAssetURL];
        AVAsset* asset = [AVURLAsset URLAssetWithURL:url options:nil];
        //Get the track object from the asset object - we'll need the trackID to tell the
        //AVPlayer that it needs to modify the volume of this track
        AVAssetTrack* track = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        //Build the AVPlayerItem - this is where you modify the volume, etc. Not the AVPlayer itself
        AVPlayerItem* playerItem = [[AVPlayerItem alloc] initWithAsset: asset]; //initWithURL:url];
        self.currentItem = playerItem;
        //Set up some audio mix parameters to tell the AVPlayer what to do with this AVPlayerItem
        AVMutableAudioMixInputParameters* audioParams = [AVMutableAudioMixInputParameters audioMixInputParameters];
        [audioParams setVolume: 0.5 atTime:kCMTimeZero]; //replace 0.5 with your volume
        [audioParams setTrackID: track.trackID]; //here's the track id
        //Set up the actual AVAudioMix object, which aggregates effects
        AVMutableAudioMix* audioMix = [AVMutableAudioMix audioMix];
        [audioMix setInputParameters: [NSArray arrayWithObject: audioParams]];
        //apply your AVAudioMix object to the AVPlayerItem
        [playerItem setAudioMix:audioMix];
        //refresh the AVPlayer object, and play the track
        [self.audioPlayer replaceCurrentItemWithPlayerItem: playerItem];
        [self.audioPlayer play];
    }
    self.currentPlaybackState = MPMusicPlaybackStatePlaying;
}
- (void) pauseMusic
{
    if(self.currentPlaybackState == MPMusicPlaybackStatePaused)
        return;
    [self.audioPlayer pause];
    self.currentPlaybackState = MPMusicPlaybackStatePaused;
}
- (void) skipSongForward
{
    //adjust self.currentTrack to be the next object in self.currentPlaylist
    //start the new track in a manner similar to that used in -playMusic
}
- (void) skipSongBackward
{
    float currentTime = self.audioPlayer.currentItem.currentTime.value / self.audioPlayer.currentItem.currentTime.timescale;
    //if we're more than a second into the song, just skip back to the beginning of the current track
    if(currentTime > 1.0)
    {
        [self.audioPlayer seekToTime: CMTimeMake(0, 1)];
    }
    else
    {
        //otherwise, adjust self.currentTrack to be the previous object in self.currentPlaylist
        //start the new track in a manner similar to that used in -playMusic
    }
}
//Set volume mid-song - more or less the same process we used in -playMusic
- (void) setMusicVolume:(float)vol
{
    AVPlayerItem* item = self.audioPlayer.currentItem;
    AVAssetTrack* track = [[item.asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    AVMutableAudioMixInputParameters* audioParams = [AVMutableAudioMixInputParameters audioMixInputParameters];
    [audioParams setVolume: vol atTime:kCMTimeZero];
    [audioParams setTrackID: track.trackID];
    AVMutableAudioMix* audioMix = [AVMutableAudioMix audioMix];
    [audioMix setInputParameters: [NSArray arrayWithObject: audioParams]];
    [item setAudioMix:audioMix];
}
@end
Please forgive any errors you see - let me know in the comments and I'll fix them. Otherwise, I hope this helps if anyone runs into the same challenge I did!
Actually, I found a really easy way to do this by loading iPod URLs from MPMusicPlayer, but then doing playback through AVAudioPlayer.
// Get-da iTunes player thing
MPMusicPlayerController* iTunes = [MPMusicPlayerController iPodMusicPlayer];
// whazzong
MPMediaItem *currentSong = [iTunes nowPlayingItem];
// whazzurl
NSURL *currentSongURL = [currentSong valueForProperty:MPMediaItemPropertyAssetURL];
info( "AVAudioPlayer playing %s", [currentSongURL.absoluteString UTF8String] ) ;
// mamme AVAudioPlayer
NSError *err;
avAudioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:currentSongURL error:&err] ;
if( err!=nil )
{
error( "AVAudioPlayer couldn't load %s", [currentSongURL.absoluteString UTF8String] ) ;
}
avAudioPlayer.numberOfLoops = -1; //infinite
// Play that t
[avAudioPlayer prepareToPlay] ;
[avAudioPlayer play];
[avAudioPlayer setVolume:0.5]; // set the AVAUDIO PLAYER's volume to only 50%. This
// does NOT affect system volume. You can adjust this music volume anywhere else too.

Playing a sequence of sounds without gaps (iPhone)

I thought maybe the fastest way was to go with Sound Services. It is quite efficient, but I need to play sounds in a sequence, not overlapping, so I used a callback to detect when each sound has finished. This cycle produces around 0.3 seconds of lag. I know this sounds very strict, but it is basically the main axis of the program.
EDIT: I have now tried AVAudioPlayer, but I can't play sounds in a sequence without using audioPlayerDidFinishPlaying, which puts me in the same situation as the callback method of Sound Services.
EDIT2: I think that if I could somehow join the parts of the sounds I want to play into one large file, the whole audio would play continuously.
EDIT3: I thought this would work, but the audio overlaps:
waitTime = player.deviceCurrentTime;
for (int k = 0; k < [colores count]; k++)
{
    player.currentTime = 0;
    [player playAtTime:waitTime];
    waitTime += player.duration;
}
Thanks
I just tried a technique that I think will work well for you. Build an audio file with your sounds concatenated, then build some metadata about your sounds like this:
@property (strong, nonatomic) NSMutableDictionary *soundData;
@synthesize soundData=_soundData;
- (void)viewDidLoad {
    [super viewDidLoad];
    _soundData = [NSMutableDictionary dictionary];
    NSArray *sound = [NSArray arrayWithObjects:[NSNumber numberWithFloat:5.0], [NSNumber numberWithFloat:0.5], nil];
    [self.soundData setValue:sound forKey:@"soundA"];
    sound = [NSArray arrayWithObjects:[NSNumber numberWithFloat:6.0], [NSNumber numberWithFloat:0.5], nil];
    [self.soundData setValue:sound forKey:@"soundB"];
    sound = [NSArray arrayWithObjects:[NSNumber numberWithFloat:7.0], [NSNumber numberWithFloat:0.5], nil];
    [self.soundData setValue:sound forKey:@"soundC"];
}
The first number is the offset of the sound in the file, the second is the duration. Then get your player ready to play like this...
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    NSURL *url = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/audiofile.mp3", [[NSBundle mainBundle] resourcePath]]];
    NSError *error;
    audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
    audioPlayer.numberOfLoops = -1;
    if (audioPlayer == nil)
        NSLog(@"%@", [error description]);
    else {
        [audioPlayer prepareToPlay];
    }
}
Then you can build a low-level sound playing method like this ...
- (void)playSound:(NSString *)name withCompletion:(void (^)(void))completion {
    NSArray *sound = [self.soundData valueForKey:name];
    if (!sound) return;
    NSTimeInterval offset = [[sound objectAtIndex:0] floatValue];
    NSTimeInterval duration = [[sound objectAtIndex:1] floatValue];
    audioPlayer.currentTime = offset;
    [audioPlayer play];
    // schedule the pause once the clip's duration has elapsed
    // (dispatch_get_current_queue() is deprecated; the main queue works here since this is UI-driven code)
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, duration * NSEC_PER_SEC), dispatch_get_main_queue(), ^{
        [audioPlayer pause];
        completion();
    });
}
And you can play sounds in rapid combination like this ...
- (IBAction)playAB:(id)sender {
    [self playSound:@"soundA" withCompletion:^{
        [self playSound:@"soundB" withCompletion:^{}];
    }];
}
Rather than nesting blocks, you could build a higher-level method that takes a list of sound names and plays them one after the other, that would look like this:
- (void)playSoundList:(NSArray *)soundNames withCompletion:(void (^)(void))completion {
    if (![soundNames count]) return completion();
    NSString *firstSound = [soundNames objectAtIndex:0];
    NSRange remainingRange = NSMakeRange(1, [soundNames count]-1);
    NSArray *remainingSounds = [soundNames subarrayWithRange:remainingRange];
    [self playSound:firstSound withCompletion:^{
        [self playSoundList:remainingSounds withCompletion:completion];
    }];
}
Call it like this...
NSArray *list = [NSArray arrayWithObjects:@"soundB", @"soundC", @"soundA", nil];
[self playSoundList:list withCompletion:^{ NSLog(@"done"); }];
I'm assuming you want to change the sequence or omit sounds sometimes. (Otherwise you would just build the asset with all three sounds in a row and play that.)
There might be a better idea out there, but to get things very tight, you could consider producing that concatenated asset and pre-loading it (moving all the latency up front into that one load), then seeking around in it to change the sound; a sketch of building such an asset follows.
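For illustration, building the concatenated asset up front might look like this (a sketch; the file names and extension are placeholders, and you would record each sound's offset as you insert it). You could then either play the composition directly with an AVPlayer and seekToTime:, or export it once with AVAssetExportSession and load the result into your AVAudioPlayer.
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                      preferredTrackID:kCMPersistentTrackID_Invalid];
CMTime cursor = kCMTimeZero;
for (NSString *name in [NSArray arrayWithObjects:@"soundA", @"soundB", @"soundC", nil]) {
    NSURL *url = [[NSBundle mainBundle] URLForResource:name withExtension:@"caf"]; // placeholder files
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    AVAssetTrack *sourceTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    NSError *error = nil;
    [compAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                            ofTrack:sourceTrack
                             atTime:cursor
                              error:&error];
    // Remember 'cursor' here as this sound's offset within the big asset.
    cursor = CMTimeAdd(cursor, asset.duration);
}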

iOS - How can i get the playable duration of AVPlayer

The MPMoviePlayerController has a property called playableDuration.
playableDuration The amount of currently playable content (read-only).
@property (nonatomic, readonly) NSTimeInterval playableDuration
For progressively downloaded network content, this property reflects
the amount of content that can be played now.
Is there something similar for AVPlayer?
I can't find anything in the Apple Docs or Google (not even here at Stackoverflow.com)
Thanks in advance.
playableDuration can be roughly implemented by the following procedure:
- (NSTimeInterval) playableDuration
{
    // use loadedTimeRanges to compute playableDuration.
    AVPlayerItem * item = _moviePlayer.currentItem;
    if (item.status == AVPlayerItemStatusReadyToPlay) {
        NSArray * timeRangeArray = item.loadedTimeRanges;
        CMTimeRange aTimeRange = [[timeRangeArray objectAtIndex:0] CMTimeRangeValue];
        double startTime = CMTimeGetSeconds(aTimeRange.start);
        double loadedDuration = CMTimeGetSeconds(aTimeRange.duration);
        // FIXME: should we sum up all sections to get a total playable duration,
        // or just use the first section as the whole?
        NSLog(@"get time range, its start is %f seconds, its duration is %f seconds.", startTime, loadedDuration);
        return (NSTimeInterval)(startTime + loadedDuration);
    }
    else
    {
        return(CMTimeGetSeconds(kCMTimeInvalid));
    }
}
_moviePlayer is your AVPlayer instance; by checking the AVPlayerItem's loadedTimeRanges, you can compute an estimated playableDuration.
For videos that have only one section, you can use this procedure as-is; for multi-section videos, you may want to check all the time ranges in the loadedTimeRanges array to get the correct answer, as in the sketch below.
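For the multi-section case, one option (a sketch, not part of the original answer) is to sum every loaded range instead of using only the first:
- (NSTimeInterval)totalLoadedDuration
{
    NSTimeInterval total = 0;
    // add up the duration of every loaded range reported by the current item
    for (NSValue *value in _moviePlayer.currentItem.loadedTimeRanges) {
        CMTimeRange range = [value CMTimeRangeValue];
        total += CMTimeGetSeconds(range.duration);
    }
    return total;
}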
all you need is
self.player.currentItem.asset.duration
simply best
Building on John's Answer…
This is the apparent default behavior of Apple players: "Show the Max Time of the playable range that encloses the current time"
- (NSTimeInterval)currentItemPlayableDuration {
    // use loadedTimeRanges to compute playableDuration.
    AVPlayerItem * item = self.audioPlayer.currentItem;
    if (item.status == AVPlayerItemStatusReadyToPlay) {
        NSArray * timeRangeArray = item.loadedTimeRanges;
        CMTime currentTime = self.audioPlayer.currentTime;
        __block CMTimeRange aTimeRange;
        [timeRangeArray enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
            // use the range being enumerated, not always the first one
            aTimeRange = [obj CMTimeRangeValue];
            if(CMTimeRangeContainsTime(aTimeRange, currentTime))
                *stop = YES;
        }];
        CMTime maxTime = CMTimeRangeGetEnd(aTimeRange);
        return CMTimeGetSeconds(maxTime);
    }
    else
    {
        return(CMTimeGetSeconds(kCMTimeInvalid));
    }
}
You will have to detect when the AVPlayer is ready to play your media file.
Let me know if you don't know how to do this; one common approach is sketched below.
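A sketch of detecting readiness with key-value observing (mPlayer matches the variable used below; the wrapper method name and context pointer are assumptions):
static void *PlayerItemStatusContext = &PlayerItemStatusContext;

- (void)observeReadinessOfItem:(AVPlayerItem *)playerItem
{
    // call this right after creating the AVPlayerItem
    [playerItem addObserver:self
                 forKeyPath:@"status"
                    options:NSKeyValueObservingOptionNew
                    context:PlayerItemStatusContext];
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if (context != PlayerItemStatusContext) {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
        return;
    }
    if ([mPlayer currentItem].status == AVPlayerItemStatusReadyToPlay) {
        CMTime duration = [self playerItemDuration]; // safe to use now; remember to removeObserver: later
    }
}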
However, once the media file is loaded, you can use this method:
#import <AVFoundation/AVFoundation.h>
/**
 * Get the duration for the currently set AVPlayer's item.
 */
- (CMTime)playerItemDuration {
    AVPlayerItem *playerItem = [mPlayer currentItem];
    if (playerItem.status == AVPlayerItemStatusReadyToPlay) {
        return [[playerItem asset] duration];
    }
    return(kCMTimeInvalid);
}
When you use this method it's important to understand (because you're streaming content) that the length value may be invalid, so you must check it before using it for processing:
CMTime playerDuration = [self playerItemDuration];
if (CMTIME_IS_INVALID(playerDuration)) {
    return;
}
double duration = CMTimeGetSeconds(playerDuration);
Swift version of the closest playable duration:
var playableDuration: TimeInterval? {
    guard let currentItem = currentItem else { return nil }
    guard currentItem.status == .readyToPlay else { return nil }
    let timeRangeArray = currentItem.loadedTimeRanges
    let currentTime = self.currentTime()
    for value in timeRangeArray {
        let timeRange = value.timeRangeValue
        if CMTimeRangeContainsTime(timeRange, currentTime) {
            return CMTimeGetSeconds(CMTimeRangeGetEnd(timeRange))
        }
    }
    guard let timeRange = timeRangeArray.first?.timeRangeValue else { return 0 }
    let startTime = CMTimeGetSeconds(timeRange.start)
    let loadedDuration = CMTimeGetSeconds(timeRange.duration)
    return startTime + loadedDuration
}
