How to get the MPMediaItem (audiobook) individual chapters - ios

I know I can get all the audiobooks from the iPod library with:
MPMediaPropertyPredicate *abPredicate =
[MPMediaPropertyPredicate predicateWithValue:[NSNumber numberWithInt:MPMediaTypeAudioBook]
forProperty:MPMediaItemPropertyMediaType];
MPMediaQuery *abQuery = [[MPMediaQuery alloc] init];
[abQuery addFilterPredicate:abPredicate];
[abQuery setGroupingType:MPMediaGroupingAlbum];
NSArray *books = [abQuery collections];
And I can get the parts/files for each book by using this:
[book items];
What I can't figure out is how to get the separate chapters that make up each part.
I know you can see this in the iPod application by tapping the "track" button in the upper right corner while playing a book. This flips the player around and shows the list of chapters.
Is Apple using a private API to get this?

To get the individual chapters you need to create an AVAsset from the MPMediaItem's AssetURL property.
NSURL *assetURL = [mediaItem valueForProperty:MPMediaItemPropertyAssetURL];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
NSArray *locales = [asset availableChapterLocales];
NSLocale *locale = [locales firstObject]; // pick one of the available locales
NSArray *chapters = [asset chapterMetadataGroupsWithTitleLocale:locale containingItemsWithCommonKeys:[NSArray arrayWithObject:AVMetadataCommonKeyArtwork]];
Get the URL and create the asset, check the available locales for the chapters, and get the chapters from the asset. The result is an array of AVTimedMetadataGroup objects, each containing a CMTimeRange and an array of AVMetadataItems. Each AVMetadataItem holds one piece of metadata (e.g. chapter title or chapter artwork).
According to the documentation the only supported key is the AVMetadataCommonKeyArtwork.
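A minimal sketch of walking the returned groups to pull out each chapter's start time and title (assuming `chapters` is the array from the snippet above; a title item is only present if the file actually embeds one):

```objc
for (AVTimedMetadataGroup *group in chapters) {
    // Each group covers one chapter's time range.
    CMTimeRange range = [group timeRange];
    NSLog(@"Chapter starts at %f seconds", CMTimeGetSeconds(range.start));
    for (AVMetadataItem *item in [group items]) {
        if ([[item commonKey] isEqualToString:AVMetadataCommonKeyTitle]) {
            NSLog(@"Chapter title: %@", [item stringValue]);
        }
    }
}
```

You could then seek the player to `range.start` when the user taps a chapter in your own list.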

Related

PHAsset assetResourcesForAsset fails when called too often

I need to retrieve the names of all the PHAsset existing in the Camera Roll, individually and in a short time.
To get the file name, I use the documented originalFilename property of PHAssetResource associated to the PHAsset.
This works fine for the first assets, but at some point (after around 400 assets), it starts failing and returning nil every time.
Here is a code that shows this behavior (running on an iPhone 7 with ~800 photos in the Camera Roll):
PHFetchResult *result = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum
subtype:PHAssetCollectionSubtypeSmartAlbumUserLibrary options:nil];
PHAssetCollection *assetCollection = result.firstObject;
PHFetchResult *assetsFetchResult = [PHAsset fetchAssetsInAssetCollection:assetCollection options:nil];
for (int i = 0; i < [assetsFetchResult count]; i++) {
    PHAsset *asset = assetsFetchResult[i];
    NSArray *resources = [PHAssetResource assetResourcesForAsset:asset];
    NSString *name = (resources.count > 0) ? [(PHAssetResource *)resources.firstObject originalFilename] : nil;
    NSLog(@"%i: %@", i, name);
}
}
When using undocumented methods to get the file name, such as [asset valueForKey:@"filename"] or the PHImageFileURLKey key of the info dictionary returned by the PHImageManager, everything works well (although the name differs from originalFilename, and it isn't reliable since it's not documented).
How come the official method is that unreliable?
Is there something I do wrong?

AVURLAsset can't get mp3 duration in documents directory, but it works well with mp3 in app bundle

I have two mp3 files. One is in app bundle, the other is in user's Documents directory. I want to get the duration of mp3 file.
AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:[NSURL URLWithString:newPath] options:nil];
[audioAsset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
    CMTime audioDuration = audioAsset.duration;
    float audioDurationSeconds = CMTimeGetSeconds(audioDuration);
    NSLog(@"duration:%f", audioDurationSeconds);
}];
This code works fine with the mp3 file in the app bundle, but for the mp3 in the Documents directory it only logs "duration:0.000000". Why?
This is how I build the path to the mp3 file in the Documents directory:
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *path = [documentsDirectory stringByAppendingPathComponent:@"Professor_and_the_Plant.mp3"];
NSURL *mp3_url = [[NSURL alloc] initFileURLWithPath:path];
AVURLAsset defaults to working quickly. If the duration or metadata isn't readily available (e.g., in the ID3 header), it doesn't yield it. To demand accurate data, even if it takes longer, you have to give it options:
static NSDictionary *options;
if (!options) {
    NSNumber *val = [[NSNumber alloc] initWithBool:YES];
    options = [[NSDictionary alloc] initWithObjectsAndKeys:val, AVURLAssetPreferPreciseDurationAndTimingKey, nil];
}
AVAsset *asset = [[AVURLAsset alloc] initWithURL:url options:options];
(I stuck the options in a static so I don't have to keep recreating and disposing it.)
If your assets in documents vs. app bundle are tagged differently, this could be your problem.
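Putting the pieces together, a sketch of loading a precise duration from the Documents directory (the file name is the one from the question; error handling kept minimal):

```objc
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *path = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"Professor_and_the_Plant.mp3"];
// Use a file URL, not URLWithString:, for a filesystem path.
NSURL *url = [NSURL fileURLWithPath:path];

NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
                                                    forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:options];
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"duration"] completionHandler:^{
    NSError *error = nil;
    if ([asset statusOfValueForKey:@"duration" error:&error] == AVKeyValueStatusLoaded) {
        NSLog(@"duration: %f", CMTimeGetSeconds(asset.duration));
    }
}];
```

Checking statusOfValueForKey: in the completion handler avoids reading the duration before it has actually loaded.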

Exception error trying to merge videos with AVMutableComposition: "objects cannot be nil"... but they don't appear to be nil?

My brain feels like scrambled eggs... I'm trying to merge video clips together. I have each clip URL stored in an NSMutableArray (arrayClipURL) and that's all good. When I print timeRanges and tracks (both NSMutableArrays) in the debug console, everything checks out, which means the for loop is doing its job. I keep getting an exception error: '*** -[__NSArrayM insertObject:atIndex:]: object cannot be nil'.
I threw in an exception breakpoint and it's breaking at the last line below. I can't figure it out because both timeRanges and tracks are NOT nil... I can print them in the debug console and see them just fine directly before the line that breaks.
composition = [[AVMutableComposition composition] init];
composedTrack = [[AVMutableCompositionTrack alloc] init];
composedTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
NSMutableArray * timeRanges = [[NSMutableArray alloc] initWithCapacity:arrayClipURL.count];
NSMutableArray * tracks = [[NSMutableArray alloc] initWithCapacity:arrayClipURL.count];
for (int i=0; i<[arrayClipURL count]; i++){
AVURLAsset *assetClip = [[AVURLAsset alloc] initWithURL:[arrayClipURL objectAtIndex:i] options:nil];
AVAssetTrack *clipTrack = [[AVAssetTrack alloc] init];
clipTrack = [[assetClip tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[timeRanges addObject:[NSValue valueWithCMTimeRange:CMTimeRangeMake(kCMTimeZero, assetClip.duration)]];
[tracks addObject:clipTrack];
}
debug(@"timeranges: %@", timeRanges);
debug(@"tracks: %@", tracks);
[composedTrack insertTimeRanges:timeRanges ofTracks:tracks atTime:kCMTimeZero error:nil];
I am in need of some veteran assistance :( Any idea what could be causing the problem?
Edit:
Here is how the 2 arrays look when printed in the console. The only thing I can think of is that maybe the CMTimeRange or AVAssetTrack objects aren't formatted properly in the arrays? I have no idea. Just thought it might help to see what it's actually trying to insert when the exception is thrown.
2013-02-18 13:18:20.811 app[5242:907] [AVCaptureManager.m:401] timeranges array: (
"CMTimeRange: {{0/1 = 0.000}, {498/600 = 0.830}}",
"CMTimeRange: {{0/1 = 0.000}, {556/600 = 0.927}}"
)
2013-02-18 13:18:20.812 app[5242:907] [AVCaptureManager.m:402] tracks array: (
"<AVAssetTrack: 0x220c21a0, trackID = 1, mediaType = vide>",
"<AVAssetTrack: 0x1cdec820, trackID = 1, mediaType = vide>"
)
Your problem is caused by the scope of your AVURLAsset instances.
Since AVURLAsset *assetClip is declared inside the for loop, it is released once the loop exits, and so are the tracks you extracted from it.
If you keep each assetClip in an array that outlives the for loop, that should fix your problem.
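A sketch of that fix against the code in the question: the only new piece is an `assets` array that keeps each AVURLAsset alive until after the insert; everything else is unchanged.

```objc
NSMutableArray *assets = [[NSMutableArray alloc] initWithCapacity:arrayClipURL.count];
for (int i = 0; i < [arrayClipURL count]; i++) {
    AVURLAsset *assetClip = [[AVURLAsset alloc] initWithURL:[arrayClipURL objectAtIndex:i] options:nil];
    [assets addObject:assetClip]; // keep the asset alive; its tracks are only valid while it is
    AVAssetTrack *clipTrack = [[assetClip tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    [timeRanges addObject:[NSValue valueWithCMTimeRange:CMTimeRangeMake(kCMTimeZero, assetClip.duration)]];
    [tracks addObject:clipTrack];
}
[composedTrack insertTimeRanges:timeRanges ofTracks:tracks atTime:kCMTimeZero error:nil];
```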

How to get iTunes Library top 10 song by playcount within recent 5days?

I'm trying to fetch the iTunes library in iOS and retrieve the 10 most-played songs within the last 5 days.
Could you tell me how to do this? Here's my code.
MPMediaPropertyPredicate is not the right answer... I guess.
MPMediaQuery *everything = [[MPMediaQuery alloc] init];
NSLog(@"Logging items from a generic query...");
NSArray *itemsFromGenericQuery = [everything items];
for (MPMediaItem *song in itemsFromGenericQuery) {
    NSString *songTitle = [song valueForProperty:MPMediaItemPropertyTitle];
    NSString *artistName = [song valueForProperty:MPMediaItemPropertyArtist];
    NSDate *lastPlayedDate = [song valueForProperty:MPMediaItemPropertyLastPlayedDate];
    NSNumber *playCount = [song valueForProperty:MPMediaItemPropertyPlayCount];
    NSLog(@"%@", songTitle);
    text.text = [NSString stringWithFormat:@"%@\n%@ %@ %@ %@", text.text, songTitle, artistName, lastPlayedDate, playCount];
}
All the best.
Apparently, play count is considered a "user-defined" key and therefore cannot be used in an MPMediaPropertyPredicate.
See http://developer.apple.com/library/ios/documentation/mediaplayer/reference/MPMediaItem_ClassReference/Reference/Reference.html#//apple_ref/doc/uid/TP40008211-CH1-SW38
You will have to iterate the songs and retrieve these properties manually.
enumerateValuesForProperties:usingBlock: is probably your most efficient option here.
http://developer.apple.com/library/ios/documentation/mediaplayer/reference/MPMediaEntity_ClassReference/Reference/Reference.html#//apple_ref/occ/instm/MPMediaEntity/enumerateValuesForProperties:usingBlock:
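A sketch of that manual approach: enumerate the songs, keep the ones last played within 5 days, sort by play count, and take the first 10. Note that MPMediaItemPropertyPlayCount is a lifetime total and the library does not record per-day plays, so "top 10 within 5 days" can only be approximated this way.

```objc
MPMediaQuery *everything = [[MPMediaQuery alloc] init];
NSDate *cutoff = [NSDate dateWithTimeIntervalSinceNow:-5 * 24 * 60 * 60];
NSMutableArray *recent = [NSMutableArray array];
for (MPMediaItem *song in [everything items]) {
    NSDate *lastPlayed = [song valueForProperty:MPMediaItemPropertyLastPlayedDate];
    if (lastPlayed != nil && [lastPlayed compare:cutoff] == NSOrderedDescending) {
        [recent addObject:song];
    }
}
// Sort descending by lifetime play count.
[recent sortUsingComparator:^NSComparisonResult(MPMediaItem *a, MPMediaItem *b) {
    NSNumber *countA = [a valueForProperty:MPMediaItemPropertyPlayCount];
    NSNumber *countB = [b valueForProperty:MPMediaItemPropertyPlayCount];
    return [countB compare:countA];
}];
NSArray *top10 = [recent subarrayWithRange:NSMakeRange(0, MIN((NSUInteger)10, [recent count]))];
```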

iPhone play a specific playlist/iMix

I want to play a specific playlist (that was constructed as an iMix) from my program, as long as it exists. I am able to use
[[MPMediaQuery albumsQuery] addFilterPredicate:[MPMediaPropertyPredicate predicateWithValue:@"MyAlbum" forProperty:MPMediaItemPropertyAlbumTitle]];
to get all songs in an album (as well as many other options for artists, etc.) but there seems to be no way to access playlists.
Is there a different way to do this, or will I be forced to store all the songs in the playlist within my code and access them all that way?
I haven't used it myself, but I see a [MPMediaQuery playlistsQuery] and MPMediaGroupingPlaylist in the docs...
Does this link help?
http://discussions.apple.com/thread.jspa?threadID=2084104&tstart=0&messageID=9838244
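For what it's worth, a sketch of fetching a playlist by name with the playlistsQuery mentioned above (the playlist name "MyiMix" is illustrative):

```objc
MPMediaQuery *query = [MPMediaQuery playlistsQuery];
[query addFilterPredicate:[MPMediaPropertyPredicate predicateWithValue:@"MyiMix"
                                                           forProperty:MPMediaPlaylistPropertyName]];
NSArray *playlists = [query collections];
if ([playlists count] > 0) {
    // MPMediaPlaylist is an MPMediaItemCollection, so it can be queued directly.
    MPMediaPlaylist *playlist = [playlists objectAtIndex:0];
    MPMusicPlayerController *player = [MPMusicPlayerController applicationMusicPlayer];
    [player setQueueWithItemCollection:playlist];
    [player play];
}
```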
I ended up having to roll my own via a text file that contains the playlist information. Here is the code. The [Globals split] function just takes a string and splits it into an array of strings using either a single character ([Globals split: with:]) or each character in a string ([Globals split: withMany:]).
//Create the music player for our application.
musicPlayer = [MPMusicPlayerController applicationMusicPlayer];
[musicPlayer setShuffleMode: MPMusicShuffleModeOff];
[musicPlayer setRepeatMode: MPMusicRepeatModeAll];
//Get our song list from the text file.
NSError *error = nil;
NSString *songList = [NSString stringWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"Playlist" ofType:@"txt"] encoding:NSUTF8StringEncoding error:&error];
//Split it into each song using newlines or carriage returns.
NSArray *allSongs = [Globals split:songList withMany:@"\r\n"];
NSMutableArray *music = [NSMutableArray arrayWithCapacity:[allSongs count]];
for (int i = 0; i < [allSongs count]; i++)
{
//Split the line into tab-delimited info: title, artist, album.
NSArray *songInfo = [Globals split:[allSongs objectAtIndex:i] with:'\t'];
//Get a query using all the data we have. This should return one song.
MPMediaQuery *songQuery = [MPMediaQuery songsQuery];
if ([songInfo count] > 0)
{
[songQuery addFilterPredicate:[MPMediaPropertyPredicate predicateWithValue:[songInfo objectAtIndex:0] forProperty:MPMediaItemPropertyTitle]];
}
if ([songInfo count] > 1)
{
[songQuery addFilterPredicate:[MPMediaPropertyPredicate predicateWithValue:[songInfo objectAtIndex:1] forProperty:MPMediaItemPropertyArtist]];
}
if ([songInfo count] > 2)
{
[songQuery addFilterPredicate:[MPMediaPropertyPredicate predicateWithValue:[songInfo objectAtIndex:2] forProperty:MPMediaItemPropertyAlbumTitle]];
}
//Add the song to our collection if we were able to find it.
NSArray *matching = [songQuery items];
if ([matching count] > 0)
{
[music addObject:[matching objectAtIndex:0]];
printf("Added in: %s\n",[(NSString *)[(MPMediaItem *)[matching objectAtIndex:0] valueForProperty:MPMediaItemPropertyTitle] UTF8String]);
}
else
{
printf("Couldn't add in: %s\n",[(NSString *)[songInfo objectAtIndex:0] UTF8String]);
}
}
//Now that we have a collection, make our playlist.
if ([music count] > 0)
{
itunesLoaded = YES;
//Make a collection from the songs we found.
MPMediaItemCollection *itunesAlbum = [MPMediaItemCollection collectionWithItems:music];
//Shuffle our songs.
musicPlayer.shuffleMode = MPMusicShuffleModeSongs;
[musicPlayer setQueueWithItemCollection: itunesAlbum];
}
The text file is very easily generated using iTunes. All you need to do is create your playlist in iTunes, remove all the song info from your list except for Title, Artist, and Album, select all, and then paste into a text file. It will automatically be tab-delimited and split by carriage returns. You also won't need to worry about mistyping or anything like that.