Getting metadata from an audio stream - iOS

I would like to get the file name and, if possible, the album image from a streaming URL in an AVPlayerItem that I am playing with AVQueuePlayer, but I don't know how to go about doing this.
Also, if it turns out that my streaming URL doesn't have any metadata, can I attach metadata to my NSURL* before passing it to the AVPlayerItem?
Thanks.

Well, I am surprised no one has answered this question.
In fact, no one has answered any of my other questions, which makes me wonder how much knowledge people here truly have.
Anyway, I will go ahead and answer my own question.
I found out how to get the metadata by doing the following:
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:url];
NSArray *metadataList = [playerItem.asset commonMetadata];
for (AVMetadataItem *metaItem in metadataList) {
    NSLog(@"%@", [metaItem commonKey]);
}
Which gives me a list as follows:
title
creationDate
artwork
albumName
artist
With that list I now know how to access the metadata from my audio stream: go through the NSArray and look for an AVMetadataItem whose commonKey is the one I want (for example, title), then read its value property.
This works great, but fetching the value may take a while. You can load it asynchronously by sending loadValuesAsynchronouslyForKeys:completionHandler: to the AVMetadataItem you just found.
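For example, a minimal sketch of the asynchronous load, assuming metaItem is the AVMetadataItem found in the loop above:

    // AVMetadataItem conforms to AVAsynchronousKeyValueLoading, so the "value"
    // key can be loaded off the main thread.
    [metaItem loadValuesAsynchronouslyForKeys:@[@"value"] completionHandler:^{
        NSError *error = nil;
        if ([metaItem statusOfValueForKey:@"value" error:&error] == AVKeyValueStatusLoaded) {
            NSLog(@"Loaded value: %@", metaItem.value);
        } else {
            NSLog(@"Failed to load value: %@", error);
        }
    }];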
Hope that helps to anyone who may find themselves with the same problem.

When retrieving a particular item, I would use the metadata common-key constants declared in AVMetadataFormat.h, e.g. AVMetadataCommonKeyTitle.
// Find the index of the item whose commonKey is the title key.
NSUInteger titleIndex = [avItem.asset.commonMetadata indexOfObjectPassingTest:^BOOL(id obj, NSUInteger idx, BOOL *stop) {
    AVMetadataItem *metaItem = (AVMetadataItem *)obj;
    return [metaItem.commonKey isEqualToString:AVMetadataCommonKeyTitle];
}];
// Guard against the key being absent from the stream's metadata.
if (titleIndex != NSNotFound) {
    AVMetadataItem *item = [avItem.asset.commonMetadata objectAtIndex:titleIndex];
    NSString *title = (NSString *)item.value;
}
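Alternatively, AVMetadataItem ships a filtering helper that does the same lookup; a minimal sketch, assuming the same avItem as above:

    // Filter the common metadata down to items matching the title key.
    NSArray *titleItems = [AVMetadataItem metadataItemsFromArray:avItem.asset.commonMetadata
                                                         withKey:AVMetadataCommonKeyTitle
                                                        keySpace:AVMetadataKeySpaceCommon];
    if (titleItems.count > 0) {
        NSString *title = (NSString *)[titleItems.firstObject value];
        NSLog(@"Title: %@", title);
    }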

Related

Objective C: Getting MPMediaItem 'Favorite' property

I was looking at Apple's documentation, and I cannot seem to find a way to tell whether or not an MPMediaItem is a 'favorite' track (the one marked with the pink heart in the Music app).
How can one get this property? Since it's a new feature, I know its availability would be limited to iOS 8.4 or later.
Here's some code I'm using to get other properties from MPMediaItems, via the music picker:
- (void)processMediaItems:(MPMediaItemCollection *)mediaItemCollection
{
    // Iterate through the selected songs.
    if (mediaItemCollection) {
        NSArray *allSelectedSongs = [mediaItemCollection items];
        for (MPMediaItem *song in allSelectedSongs) {
            NSURL *songURL = [song valueForProperty:MPMediaItemPropertyAssetURL];
            NSNumber *ident = [song valueForProperty:MPMediaEntityPropertyPersistentID];
            NSString *identString = [BukketHelper convertULLToNSString:ident];
            NSNumber *isCloud = [song valueForProperty:MPMediaItemPropertyIsCloudItem];
        }
        // Do other stuff here.
    }
}
Anyone have ideas?
You have to use the Apple Music API to get or set a user's like/dislike rating for a song, like this:
GET https://api.music.apple.com/v1/me/ratings/songs/{id}
From: Apple Docs link
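A minimal sketch of that request with NSURLSession; the developer token, Music-User-Token, and song ID below are placeholders you would supply yourself:

    // Hypothetical tokens and song ID -- substitute real values.
    NSString *songID = @"900032829";
    NSURL *url = [NSURL URLWithString:
        [NSString stringWithFormat:@"https://api.music.apple.com/v1/me/ratings/songs/%@", songID]];
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
    // The Apple Music API expects a developer token plus a music user token.
    [request setValue:@"Bearer <developer-token>" forHTTPHeaderField:@"Authorization"];
    [request setValue:@"<music-user-token>" forHTTPHeaderField:@"Music-User-Token"];
    [[[NSURLSession sharedSession] dataTaskWithRequest:request
            completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
        if (data) {
            // The response JSON carries the rating (1 = like, -1 = dislike).
            NSLog(@"%@", [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding]);
        }
    }] resume];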

CoreSpotlight indexing

Hi, I'm trying to implement CoreSpotlight in my app.
When indexing, do I need to run this every time, or is it sufficient to run it once when the app is installed for the first time?
If the app is deleted, do I need to index again?
Here's the code I'm using:
- (void)spotLightIndexing {
    NSString *path = [[NSBundle mainBundle] pathForResource:@"aDetailed" ofType:@"plist"];
    NSDictionary *plistDict = [[NSDictionary alloc] initWithContentsOfFile:path];
    NSArray *plistArray = [plistDict allKeys];
    for (id key in plistDict) {
        // Create an attribute set for the item.
        CSSearchableItemAttributeSet *attributeSet = [[CSSearchableItemAttributeSet alloc] initWithItemContentType:(NSString *)kUTTypeImage];
        // Set properties that describe attributes of the item such as title, description, and image.
        attributeSet.title = key;
        attributeSet.contentDescription = [plistDict objectForKey:key];
        //*************************************
        attributeSet.keywords = plistArray; // Another Q: do I need this????
        //**************************************
        UIImage *image = [UIImage imageNamed:@"icon.png"];
        NSData *imageData = [NSData dataWithData:UIImagePNGRepresentation(image)];
        attributeSet.thumbnailData = imageData;
        // Create a searchable item, specifying its ID, associated domain, and the attribute set created above.
        NSString *identifier = [NSString stringWithFormat:@"%@", attributeSet.title];
        CSSearchableItem *item = [[CSSearchableItem alloc] initWithUniqueIdentifier:identifier domainIdentifier:@"com.example.apple_sample.theapp.search" attributeSet:attributeSet];
        // Index the item.
        [[CSSearchableIndex defaultSearchableIndex] indexSearchableItems:@[item] completionHandler:^(NSError * __nullable error) {
            if (!error) {
                NSLog(@"Search item indexed");
            } else {
                NSLog(@"******************* E R R O R *********************");
            }
        }];
    }
}
Thank you.
It's indexed as specified. So if you put your spotLightIndexing method in didFinishLaunchingWithOptions, it will naturally index items on every launch, unless you set a bool of course (a sketch of that guard follows below). If the app is deleted, it will re-index on the next launch, as the NSUserDefaults values will have been zeroed out. That is why they offer adding/altering/updating indices via batch updates or other methods, as annotated here.
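For instance, a minimal sketch of that guard; the defaults key name is made up for illustration:

    // Only index once per install; the flag is lost when the app is deleted,
    // so a fresh install will index again.
    if (![[NSUserDefaults standardUserDefaults] boolForKey:@"hasIndexedSpotlight"]) {
        [self spotLightIndexing];
        [[NSUserDefaults standardUserDefaults] setBool:YES forKey:@"hasIndexedSpotlight"];
    }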
Since you're populating it from a local plist as opposed to the web, you will have to do the updates yourself or create an index-maintenance app extension.
If you watch the WWDC video on this topic, you will see that it's easy to update or delete domains as a 'group' using the domain identifier (source). It's a good watch.
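For example, a minimal sketch of deleting everything indexed under one domain identifier, reusing the domain from the question's code:

    // Remove every searchable item that was indexed under this domain.
    [[CSSearchableIndex defaultSearchableIndex]
        deleteSearchableItemsWithDomainIdentifiers:@[@"com.example.apple_sample.theapp.search"]
        completionHandler:^(NSError * __nullable error) {
            if (error) {
                NSLog(@"Failed to delete indexed items: %@", error);
            }
        }];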
As far as the keywords go, there is no telling until the documentation fully covers the iOS 9 APIs. But just from reading what Apple has publicly provided, here is a note you should consider:
Important: Be sure to avoid over-indexing your app content or adding unrelated keywords and attributes in an attempt to improve the ranking of your results. Because iOS measures the level of user engagement with search results, items that users don’t find useful are quickly identified and can eventually stop showing up in results.
That is located after the new Search features summary. And it goes on to say why:
When you combine multiple Search APIs, items can get indexed from multiple places. To avoid giving users duplicate items in search results, you need to link item IDs appropriately. To ensure that item IDs are linked, you can use the same value in a searchable item’s uniqueIdentifier property and in the relatedUniqueIdentifier property within an NSUserActivity object’s contentAttributes property.
In other words, if you incorporate NSUserActivity as they intend (because it can apply to all users of your app, not just the person doing the querying), the same item can show up multiple times in the same search. So, based on Apple's suggestions, try not to use keywords unless you're sure, especially in your example, where the keyword already equals the uniqueIdentifier.
Personally, I've already implemented this in my app and love it; however, I use web markup, which makes batch updates almost instantaneous, as opposed to your route, where you would have to actually push out a new app update to re-update/delete the indices.

iOS Photos framework unable to get image data / UTI

When requesting images from the Photos framework, I manage to get all but the last 64 correctly. Those last ones always return nil for both dataUTI and imageData in the following code. While attempting to figure out what was going on, I found that the PHAsset knows exactly what the UTI is, but it is reported to me as nil.
Anyone else seen this?
You can see I've made my code read the asset's UTI (via KVC) when it's reported as nil, so that my app can determine whether it's a GIF, but this isn't an advisable way of doing it, and I never get the imageData anyway, so it's not a huge amount of help!
PHFetchOptions *fetchOptions = [[PHFetchOptions alloc] init];
fetchOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
PHFetchResult *allPhotosResult = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:fetchOptions];
[allPhotosResult enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    options.synchronous = NO;
    options.networkAccessAllowed = YES;
    options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
    [[PHImageManager defaultManager] requestImageDataForAsset:asset options:options resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
        // Fall back to the asset's private UTI (via KVC) when dataUTI comes back nil.
        NSString *val = [asset valueForKey:@"uniformTypeIdentifier"];
        if (!dataUTI) {
            dataUTI = val;
        }
    }];
}];
EDIT:
I forgot to mention that the missing images aren't the most recent ones; their creation dates seem spread out. Actually, even the Photos app doesn't seem to show them: looking at the neighboring images, there doesn't seem to be anything at the positions where their creation dates would place them.
Not much of an answer here, so happy for someone else to take a bash at explaining it!
Looking at the creation dates of the missing assets, I managed to track one down in the Photos app that was missing from my app. It had a thumbnail, and when I selected it, it showed the circular download indicator to pull down the data. But then, trying to open it in my app's Action Extension (which just lets you preview the GIF's animation in the Photos app or elsewhere), a popup appeared saying there was an error preparing it. I've not seen that before, but clearly something was going wonky with iCloud.
Previously I was requesting PHImageRequestOptionsVersionUnadjusted in my app, but switching it to PHImageRequestOptionsVersionOriginal seems to have fixed it...?
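In code, that's a one-property change on the request options, assuming the same PHImageRequestOptions setup as in the question:

    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    // Requesting the original bytes (rather than the unadjusted render) appears
    // to coax iCloud into delivering the data for these problem assets.
    options.version = PHImageRequestOptionsVersionOriginal;
    options.networkAccessAllowed = YES;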

Get metadata displayed in MPNowPlayingInfoCenter's nowPlayingInfo (lock screen and remote control)

Thanks for noticing this question. I'm working on music recommendation, and what I am doing now is leveraging MPNowPlayingInfoCenter's nowPlayingInfo, like this:
NSDictionary *metaData = [[MPNowPlayingInfoCenter defaultCenter] nowPlayingInfo];
NSString *songTitle = metaData[MPMediaItemPropertyTitle];
NSString *albumTitle = metaData[MPMediaItemPropertyAlbumTitle];
NSString *artist = metaData[MPMediaItemPropertyArtist];
But it always returns nil when the "Music" app is playing music in the background.
I looked up the related documentation, and it says:
MPNowPlayingInfoCenter provides an interface for setting the current now
playing information for the application.
The default center holds now playing info about the current application
It seems there is no way to get another app's nowPlayingInfo through MPNowPlayingInfoCenter. So are there any other ways to get the music metadata another app displays in the remote control/lock screen? Thanks!
You can get what the iPod player is currently playing:
MPMusicPlayerController *player = [MPMusicPlayerController iPodMusicPlayer];
// Get the now-playing item.
MPMediaItem *item = [player nowPlayingItem];
// Get the title of the song.
NSString *titleStr = [item valueForProperty:MPMediaItemPropertyTitle];
NSLog(@"titleStr %@", titleStr);

ALAsset "real" filename

I have transferred an MP4 from iTunes to my iPad. If I open the Videos app I can see it listed - there's the thumbnail and the filename below it (my_file.mp4). However, the actual filename of the asset is changed by iOS to some unique value - IMG_001.MOV, for example. I would like to get the original filename as it is listed in the Videos app (my_file.mp4). How or where do I find this?
Thanks
Here is code to get the real name of an asset file:
ALAssetsGroupEnumerationResultsBlock assetsEnumerationBlock = ^(ALAsset *result, NSUInteger index, BOOL *stop) {
    if (result) {
        [self.arrAssetsMedia addObject:result];
        ALAssetRepresentation *rep = [result defaultRepresentation];
        if (rep.filename != nil) {
            NSLog(@"File name is: %@", rep.filename);
            [arrMediaFileName addObject:rep.filename];
        }
    }
};
Note: ALAssetRepresentation has a property, - (NSString *)filename, which gives us the file name.
You can't get the MP4 file from your device with its original name.
The only way to retrieve photo-gallery items (photos/videos) from your device into your application is through ALAssetsLibrary.
The items will have names such as assets-library://asset/asset.MP4?id=1000000001&ext=MP4
So you can track the URL and get the MP4 using the ALAssets library.
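For example, a minimal sketch of fetching an asset by its library URL; the URL here is just the placeholder form from above:

    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    NSURL *assetURL = [NSURL URLWithString:@"assets-library://asset/asset.MP4?id=1000000001&ext=MP4"];
    [library assetForURL:assetURL resultBlock:^(ALAsset *asset) {
        // The representation exposes the stored filename (e.g. IMG_001.MOV),
        // not the original iTunes filename.
        NSLog(@"Filename: %@", [[asset defaultRepresentation] filename]);
    } failureBlock:^(NSError *error) {
        NSLog(@"Could not load asset: %@", error);
    }];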