Can I tell whether a PHAsset / AVAsset is an HDR / Dolby Vision video on iPhone 12? - ios

I want to add an HDR icon to indicate that an asset is HDR, but I can't find any info to check whether a video is an HDR recording from an iPhone 12.

+ (BOOL)isHDRVideo:(AVAsset *)avasset {
    if (!avasset) {
        return NO;
    }
    __block BOOL isHDRVideo = NO;
    [avasset.tracks enumerateObjectsUsingBlock:^(AVAssetTrack * _Nonnull track, NSUInteger idx, BOOL * _Nonnull stopTracks) {
        [track.formatDescriptions enumerateObjectsUsingBlock:^(id _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stopFormatDescriptions) {
            CMFormatDescriptionRef desc = (__bridge CMVideoFormatDescriptionRef)obj;
            NSDictionary *extensions = (__bridge NSDictionary *)CMFormatDescriptionGetExtensions(desc);
            NSString *colorPrimaries = extensions[(__bridge id)kCVImageBufferColorPrimariesKey];
            // Treat BT.2020 color primaries as a proxy for HDR content.
            if ([colorPrimaries isEqualToString:(__bridge NSString *)kCVImageBufferColorPrimaries_ITU_R_2020]) {
                *stopFormatDescriptions = YES;
                *stopTracks = YES;
                isHDRVideo = YES;
            }
        }];
    }];
    return isHDRVideo;
}

A better approach could be using asset.tracks(withMediaCharacteristic: .containsHDRVideo) (iOS 14+),
or checking the tracks directly:
asset.tracks.contains { $0.hasMediaCharacteristic(.containsHDRVideo) }
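For instance, a minimal sketch of the characteristic-based check (iOS 14+; assumes an AVAsset named asset):
import AVFoundation

// Sketch: true if any track declares the HDR media characteristic (iOS 14+).
func isHDR(_ asset: AVAsset) -> Bool {
    // tracks(withMediaCharacteristic:) returns only the tracks carrying the characteristic.
    return !asset.tracks(withMediaCharacteristic: .containsHDRVideo).isEmpty
}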

This post really helped me out, so I felt I had to share back what I've found to provide a Swift-based approach for future readers.
One can extend AVAssetTrack to move all the logic to where the CMFormatDescription is defined:
public extension AVAssetTrack {
    var isHDRVideo: Bool {
        guard
            self.mediaType == .video, // A non-video track cannot be HDR
            let formatDescription = (self.formatDescriptions as? [CMFormatDescription])?.first, // Safely get the description
            let transferFunction = CMFormatDescriptionGetExtension(
                formatDescription,
                extensionKey: kCVImageBufferTransferFunctionKey) as? String // The extension can be nil
        else { return false }
        return [
            kCVImageBufferTransferFunction_ITU_R_2020,
            kCVImageBufferTransferFunction_ITU_R_2100_HLG,
            kCVImageBufferTransferFunction_SMPTE_ST_2084_PQ
        ].map { $0 as String }.contains(transferFunction)
    }
}
Then say you have an AVAsset called asset, you can do:
asset.tracks(withMediaType: .video).map { $0.isHDRVideo }
and build your logic on that.

Related

Convert from CGImageMetadata to NSDictionary

I tried to convert my 360° image's metadata to an NSDictionary. The app crashes every time I try to print the value of an attribute. I wrote this code:
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
NSArray *metadataArray = nil;
if (source) {
    _metaData = CGImageSourceCopyMetadataAtIndex(source, 0, NULL);
    if (_metaData) {
        metadataArray = CFBridgingRelease(CGImageMetadataCopyTags(_metaData));
        CFRelease(_metaData);
    }
    CFRelease(source);
}
NSLog(@"%@", metadataArray[12]); // this is the problem
In the output, I find that the type of immutableMetadata is CGImageMetadata, not NSDictionary. How can I convert it to an NSDictionary?
Swift
extension CGImageSource {
    func metadata() -> Dictionary<String, Any>? {
        guard let imageMetadata = CGImageSourceCopyMetadataAtIndex(self, 0, .none) else {
            return .none
        }
        guard let tags = CGImageMetadataCopyTags(imageMetadata) else {
            return .none
        }
        var result = [String: Any]()
        for tag in tags as NSArray {
            let tagMetadata = tag as! CGImageMetadataTag
            if let cfName = CGImageMetadataTagCopyName(tagMetadata) {
                let name = String(cfName)
                let value = CGImageMetadataTagCopyValue(tagMetadata)
                result[name] = value
            }
        }
        return result
    }
}
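Hypothetical usage of the extension above, assuming imageData holds the raw image bytes:
import ImageIO

// Sketch: build a source from the data and look up a tag by name.
if let source = CGImageSourceCreateWithData(imageData as CFData, nil),
   let metadata = source.metadata() {
    print(metadata["PoseHeadingDegrees"] ?? "tag not present")
}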
I don't know much about ImageIO. It's full of CF/CG types, which may lead to __bridge casts.
You are only interested in one value, but if someone is looking for another, this sample code/research may be useful:
I looked into CGImageMetadata.h for each function related to CGImageMetadataTagRef that might help.
CGImageSourceRef source = CGImageSourceCreateWithData((CFDataRef)imagedata, NULL);
CGImageMetadataRef imageMetadataRef = CGImageSourceCopyMetadataAtIndex(source, 0, NULL);
NSArray *metadata = CFBridgingRelease(CGImageMetadataCopyTags(imageMetadataRef));
NSLog(@"metadata = %@", metadata);
for (id aRef in metadata)
{
    CGImageMetadataTagRef currentRef = (__bridge CGImageMetadataTagRef)(aRef);
    NSLog(@"Ref: %@", currentRef);
    CGImageMetadataType type = CGImageMetadataTagGetType(currentRef);
    NSLog(@"Type: %d", type);
    CFStringRef nameSpace = CGImageMetadataTagCopyNamespace(currentRef);
    NSLog(@"NameSpace: %@", nameSpace);
    CFStringRef prefix = CGImageMetadataTagCopyPrefix(currentRef);
    NSLog(@"prefix: %@", prefix);
    CFStringRef name = CGImageMetadataTagCopyName(currentRef);
    NSLog(@"name: %@", name);
    CFTypeRef value = CGImageMetadataTagCopyValue(currentRef);
    NSLog(@"Value: %@", value);
    CFStringRef valueTypeStr = CFCopyTypeIDDescription(CFGetTypeID(value));
    NSLog(@"valueTypeStr: %@", valueTypeStr);
    // Test related to the question
    if ([@"GPano" isEqualToString:(__bridge NSString *)prefix] && [@"PoseHeadingDegrees" isEqualToString:(__bridge NSString *)name])
    {
        NSString *str = (__bridge NSString *)value;
        NSLog(@"Str: %@", str);
        NSLog(@"Int: %d", [str intValue]);
    }
}
To know whether the value is convertible to a more "known" object, like NSString (<=> CFString), NSArray (<=> CFArray), or NSDictionary (<=> CFDictionary), I looked at CFStringRef valueTypeStr = CFCopyTypeIDDescription(CFGetTypeID(value)); (or you can do a quicker test; see the related question: Determining what a CFTypeRef is?).
It's all done "manually". I'm quite surprised that there is no conversion from CGImageMetadataTagRef to a more "known" object (and not a struct). Maybe there is; I didn't read the whole of CGImageMetadata.h, so there may be a quicker way, but even this "old school" digging could help another user with a related question find what they're looking for.
To answer your initial question (CGImageMetadata to NSDictionary): if there is no easy conversion tool, a direct mapping won't make a lot of sense, because the interesting function, CGImageMetadataCopyTags, returns a CFArray (bridged to NSArray), not a dictionary.
But since a lot of CGImageMetadataTagRef prefixes are the same, you may be interested in building the following dictionary manually:
@{prefix1: @[stuff10, stuff11, etc.], prefix2: @[stuff20, stuff21, etc.]}
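A minimal Swift sketch of that grouping, assuming tags came from CGImageMetadataCopyTags as in the code above:
import ImageIO

// Sketch: group metadata tags by prefix, e.g. ["GPano": [tag, ...], "exif": [tag, ...]].
var tagsByPrefix = [String: [CGImageMetadataTag]]()
for case let tag as CGImageMetadataTag in tags as NSArray {
    guard let prefix = CGImageMetadataTagCopyPrefix(tag) as String? else { continue }
    tagsByPrefix[prefix, default: []].append(tag)
}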

Caching Images in UICollectionViewCell Swift

I am trying to cache images to prevent them from reloading constantly and crashing the app. So I went to look at Apple's ImageCaching sample, which was written in Objective-C, and had to replicate it in Swift. But when I run it and try to cache, my app crashes with the error fatal error: unexpectedly found nil while unwrapping an Optional value.
Swift Code
func updateCachedAssets() -> Void {
    let isViewVisible: Bool = self.isViewLoaded() && self.view.window != nil
    if !isViewVisible { return }
    // The preheat window is twice the height of the visible rect
    var preheatRect: CGRect = (self.collectionView?.bounds)!
    preheatRect = CGRectInset(preheatRect, 0.0, -0.5 * CGRectGetHeight(preheatRect))
    // Check if the collection view is showing an area that is significantly different to the last preheated area
    let delta: CGFloat = abs(CGRectGetMidY(preheatRect) - CGRectGetMidY(self.previousPreheatRect))
    if delta > CGRectGetHeight((self.collectionView?.bounds)!) / 3.0 {
        // Compute the assets to start caching and to stop caching
        let addedIndexPaths = NSMutableArray()
        let removedIndexPaths = NSMutableArray()
        self.computeDifferenceBetweenRect(self.previousPreheatRect, andRect: preheatRect, removedHandler: { (removedRect) -> Void in
            let indexPaths: NSArray = self.collectionView!.aapl_indexPathsForElementsInRect(removedRect)
            removedIndexPaths.addObjectsFromArray(indexPaths as [AnyObject])
        }, addedHandler: { (addedRect) -> Void in
            let indexPaths: NSArray = self.collectionView!.aapl_indexPathsForElementsInRect(addedRect)
            addedIndexPaths.addObjectsFromArray(indexPaths as [AnyObject])
        })
        //print("AssetAtIndex", self.assetsAtIndexPaths(addedIndexPaths))
        let assetsToStartCaching: NSArray = self.assetsAtIndexPaths(addedIndexPaths)
        let assetsToStopCaching: NSArray = self.assetsAtIndexPaths(removedIndexPaths)
        // Update the assets the PHCachingImageManager is caching.
        self.imageManager.startCachingImagesForAssets(assetsToStartCaching as! [PHAsset], targetSize: AssetGridThumbnailSize, contentMode: .AspectFill, options: nil)
        self.imageManager.stopCachingImagesForAssets(assetsToStopCaching as! [PHAsset], targetSize: AssetGridThumbnailSize, contentMode: .AspectFill, options: nil)
        self.previousPreheatRect = preheatRect
    }
}
Objective-C
- (void)updateCachedAssets {
    BOOL isViewVisible = [self isViewLoaded] && [[self view] window] != nil;
    if (!isViewVisible) { return; }
    // The preheat window is twice the height of the visible rect.
    CGRect preheatRect = self.collectionView.bounds;
    preheatRect = CGRectInset(preheatRect, 0.0f, -0.5f * CGRectGetHeight(preheatRect));
    /*
     Check if the collection view is showing an area that is significantly
     different to the last preheated area.
     */
    CGFloat delta = ABS(CGRectGetMidY(preheatRect) - CGRectGetMidY(self.previousPreheatRect));
    if (delta > CGRectGetHeight(self.collectionView.bounds) / 3.0f) {
        // Compute the assets to start caching and to stop caching.
        NSMutableArray *addedIndexPaths = [NSMutableArray array];
        NSMutableArray *removedIndexPaths = [NSMutableArray array];
        [self computeDifferenceBetweenRect:self.previousPreheatRect andRect:preheatRect removedHandler:^(CGRect removedRect) {
            NSArray *indexPaths = [self.collectionView aapl_indexPathsForElementsInRect:removedRect];
            [removedIndexPaths addObjectsFromArray:indexPaths];
        } addedHandler:^(CGRect addedRect) {
            NSArray *indexPaths = [self.collectionView aapl_indexPathsForElementsInRect:addedRect];
            [addedIndexPaths addObjectsFromArray:indexPaths];
        }];
        NSArray *assetsToStartCaching = [self assetsAtIndexPaths:addedIndexPaths];
        NSArray *assetsToStopCaching = [self assetsAtIndexPaths:removedIndexPaths];
        // Update the assets the PHCachingImageManager is caching.
        [self.imageManager startCachingImagesForAssets:assetsToStartCaching
                                            targetSize:AssetGridThumbnailSize
                                           contentMode:PHImageContentModeAspectFill
                                               options:nil];
        [self.imageManager stopCachingImagesForAssets:assetsToStopCaching
                                           targetSize:AssetGridThumbnailSize
                                          contentMode:PHImageContentModeAspectFill
                                              options:nil];
        // Store the preheat rect to compare against in the future.
        self.previousPreheatRect = preheatRect;
    }
}
The last code run before the crash, as shown by the debugger:
Swift
func assetsAtIndexPaths(indexPaths: NSArray) -> NSArray {
    if indexPaths.count == 0 { return [] }
    let assets = NSMutableArray(capacity: indexPaths.count)
    for indexPath in indexPaths {
        let asset = self.assetsFetchResults[indexPath.item] as! PHAsset
        assets.addObject(asset)
    }
    return assets
}
Objective-C
- (NSArray *)assetsAtIndexPaths:(NSArray *)indexPaths {
    if (indexPaths.count == 0) { return nil; }
    NSMutableArray *assets = [NSMutableArray arrayWithCapacity:indexPaths.count];
    for (NSIndexPath *indexPath in indexPaths) {
        PHAsset *asset = self.assetsFetchResults[indexPath.item];
        [assets addObject:asset];
    }
    return assets;
}
Error
fatal error: unexpectedly found nil while unwrapping an Optional value
The last line before the error is thrown (which is not an Optional value) is
let asset = self.assetsFetchResults[indexPath.item] as! PHAsset in the assetsAtIndexPaths method.
Any help would be appreciated. Thanks
You should safely unwrap the optional value. If the crash is due to an undefined value in the Optional, the following will protect against that condition. Inside your for loop, replace this code:
let asset = self.assetsFetchResults[indexPath.item] as! PHAsset
assets.addObject(asset)
with:
if let asset = self.assetsFetchResults[indexPath.item] as? PHAsset {
    assets.addObject(asset)
}
This is called Optional Binding as found in the Swift book:
You use optional binding to find out whether an optional contains a value, and if so, to make that value available as a temporary constant or variable. Optional binding can be used with if and while statements to check for a value inside an optional, and to extract that value into a constant or variable, as part of a single action. if and while statements are described in more detail in Control Flow.

Check if PHAsset exists in PHFetchResult

How do I know when the asset for a local identifier is not found? I have a list of local IDs for photos and videos fetched from the Photos framework, and I want to know whether each photo is still present in the iOS photo library.
You need to keep track of the number of assets in that fetch result, and if you haven't found the asset by the time the last one is checked, return the Not Found notification.
You can do it like:
NSString *localId = /*local identifier of photo */;
PHFetchResult *userAlbums = [PHAsset fetchAssetsWithLocalIdentifiers:@[localId] options:nil];
NSUInteger assetCount = [userAlbums count];
__block NSUInteger assetCounter = 0;
[userAlbums enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop)
{
    assetCounter++;
    // Check whether you found the asset or not
    if (/*asset found*/)
    {
        // Asset found, set the stop bool to YES
    }
    else if (assetCounter == assetCount)
    {
        // Asset not found
    }
}];
Why not just use:
if ([userAlbums count]) {
    // At least one item found.
}
else {
    // Nothing found
}
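In Swift, a minimal sketch of that emptiness check (the helper name is mine):
import Photos

// Sketch: an identifier resolves to an asset iff the fetch result is non-empty.
func assetExists(localIdentifier: String) -> Bool {
    return PHAsset.fetchAssets(withLocalIdentifiers: [localIdentifier], options: nil).count > 0
}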

How to get only images in the camera roll using Photos Framework

The following code loads images that are also located in iCloud or Photo Stream. How can we limit the search to only images in the camera roll?
var assets = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: nil)
Along with the Camera Roll and Photo Stream albums, Apple added the following PHAssetCollectionSubtype values in iOS 8.1:
PHAssetCollectionSubtypeAlbumMyPhotoStream (together with PHAssetCollectionTypeAlbum) - fetches the Photo Stream album.
PHAssetCollectionSubtypeSmartAlbumUserLibrary (together with PHAssetCollectionTypeSmartAlbum) - fetches the Camera Roll album.
I haven't tested whether this is backward-compatible with iOS 8.0.x, though.
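A minimal sketch of using those values (modern Swift API names):
import Photos

// Sketch: smartAlbumUserLibrary is the Camera Roll; fetch it, then its image assets.
let cameraRoll = PHAssetCollection.fetchAssetCollections(with: .smartAlbum,
                                                         subtype: .smartAlbumUserLibrary,
                                                         options: nil)
if let album = cameraRoll.firstObject {
    let options = PHFetchOptions()
    options.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
    let assets = PHAsset.fetchAssets(in: album, options: options)
    print("Camera roll images: \(assets.count)")
}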
Through some experimentation we discovered a hidden property not listed in the documentation (assetSource). Basically you do a regular fetch request, then use a predicate to filter the ones from the camera roll; this value should be 3.
Sample code:
// fetch all assets, then sub-fetch only the range we need
var assets = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: fetchOptions)
assets.enumerateObjectsUsingBlock { (obj, idx, bool) -> Void in
    results.addObject(obj)
}
var cameraRollAssets = results.filteredArrayUsingPredicate(NSPredicate(format: "assetSource == %@", argumentArray: [3]))
results = NSMutableArray(array: cameraRollAssets)
If, like me, you are searching for Objective-C code and haven't found an answer using the new Photos framework (only deprecated AssetsLibrary code), then this will help you:
Swift
func getAllPhotosFromCameraRoll() -> [UIImage] {
    // TODO: Add `NSPhotoLibraryUsageDescription` to Info.plist
    PHPhotoLibrary.requestAuthorization { print($0) } // TODO: Move this line of code to somewhere before attempting to access photos
    var images = [UIImage]()
    let requestOptions: PHImageRequestOptions = PHImageRequestOptions()
    requestOptions.resizeMode = .exact
    requestOptions.deliveryMode = .highQualityFormat
    requestOptions.isSynchronous = true
    let fetchResult: PHFetchResult = PHAsset.fetchAssets(with: .image, options: nil)
    let manager: PHImageManager = PHImageManager.default()
    for i in 0..<fetchResult.count {
        let asset = fetchResult.object(at: i)
        manager.requestImage(
            for: asset,
            targetSize: PHImageManagerMaximumSize,
            contentMode: .default,
            options: requestOptions,
            resultHandler: { (image: UIImage?, info: [AnyHashable: Any]?) -> Void in
                if let image = image {
                    images.append(image)
                }
            })
    }
    return images
}
Objective C
Global Variables:
NSArray *imageArray;
NSMutableArray *mutableArray;
The method below will help you:
- (void)getAllPhotosFromCamera
{
    imageArray = [[NSArray alloc] init];
    mutableArray = [[NSMutableArray alloc] init];
    PHImageRequestOptions *requestOptions = [[PHImageRequestOptions alloc] init];
    requestOptions.resizeMode = PHImageRequestOptionsResizeModeExact;
    requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
    requestOptions.synchronous = true;
    PHFetchResult *result = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
    NSLog(@"%d", (int)result.count);
    PHImageManager *manager = [PHImageManager defaultManager];
    NSMutableArray *images = [NSMutableArray arrayWithCapacity:[result count]];
    // assets contains PHAsset objects.
    __block UIImage *ima;
    for (PHAsset *asset in result) {
        // Do something with the asset
        [manager requestImageForAsset:asset
                           targetSize:PHImageManagerMaximumSize
                          contentMode:PHImageContentModeDefault
                              options:requestOptions
                        resultHandler:^void(UIImage *image, NSDictionary *info) {
                            ima = image;
                            [images addObject:ima];
                        }];
    }
    imageArray = [images copy]; // You can directly use the NSMutableArray images
}
If you use your own PHCachingImageManager instead of the shared PHImageManager instance, then when you call requestImageForAsset:targetSize:contentMode:options:resultHandler: you can set an option in PHImageRequestOptions to specify that the image must be local:
networkAccessAllowed
A Boolean value that specifies whether Photos can download the requested image from iCloud.
Discussion
If YES, and the requested image is not stored on the local device, Photos downloads the image from iCloud. To be notified of the download’s progress, use the progressHandler property to provide a block that Photos calls periodically while downloading the image. If NO (the default), and the image is not on the local device, the PHImageResultIsInCloudKey value in the result handler’s info dictionary indicates that the image is not available unless you enable network access.
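A minimal sketch of that option in Swift, assuming a PHAsset named asset:
import Photos
import UIKit

// Sketch: request the image without iCloud access, then check the info dictionary.
let options = PHImageRequestOptions()
options.isNetworkAccessAllowed = false // NO in Objective-C: never download from iCloud
PHImageManager.default().requestImage(for: asset,
                                      targetSize: PHImageManagerMaximumSize,
                                      contentMode: .default,
                                      options: options) { image, info in
    if image == nil, (info?[PHImageResultIsInCloudKey] as? NSNumber)?.boolValue == true {
        // Asset exists only in iCloud; skip it for a "camera roll only" list.
    }
}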
This can help. You can use your own data model instead of the AlbumModel I used.
func getCameraRoll() -> AlbumModel {
    var cameraRollAlbum: AlbumModel!
    let cameraRoll = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .smartAlbumUserLibrary, options: nil)
    cameraRoll.enumerateObjects({ (collection: PHAssetCollection, count: Int, stop: UnsafeMutablePointer<ObjCBool>) in
        let fetchOptions = PHFetchOptions()
        fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
        let assets = PHAsset.fetchAssets(in: collection, options: fetchOptions)
        if assets.count > 0 {
            let newAlbum = AlbumModel(name: collection.localizedTitle!, count: assets.count, collection: collection, assets: assets)
            cameraRollAlbum = newAlbum
        }
    })
    return cameraRollAlbum
}
Here is the Objective-C version provided by Apple.
- (NSMutableArray *)getNumberOfPhotoFromCameraRoll:(NSArray *)array {
    PHFetchResult *fetchResult = array[1];
    int index = 0;
    unsigned long pictures = 0;
    for (int i = 0; i < fetchResult.count; i++) {
        unsigned long temp = 0;
        temp = [PHAsset fetchAssetsInAssetCollection:fetchResult[i] options:nil].count;
        if (temp > pictures) {
            pictures = temp;
            index = i;
        }
    }
    PHCollection *collection = fetchResult[index];
    if (![collection isKindOfClass:[PHAssetCollection class]]) {
        // return;
    }
    // Configure the AAPLAssetGridViewController with the asset collection.
    PHAssetCollection *assetCollection = (PHAssetCollection *)collection;
    PHFetchResult *assetsFetchResult = [PHAsset fetchAssetsInAssetCollection:assetCollection options:nil];
    self.assetsFetchResults = assetsFetchResult;
    self.assetCollection = assetCollection;
    self.numberOfPhotoArray = [NSMutableArray array];
    for (int i = 0; i < [assetsFetchResult count]; i++) {
        PHAsset *asset = assetsFetchResult[i];
        [self.numberOfPhotoArray addObject:asset];
    }
    NSLog(@"%lu", (unsigned long)[self.numberOfPhotoArray count]);
    return self.numberOfPhotoArray;
}
Where you can grab the following details:
PHFetchResult *fetchResult = self.sectionFetchResults[1];
PHCollection *collection = fetchResult[6];
value 1,6 used to get camera images
value 1,0 used to get screen shots
value 1,1 used to get hidden
value 1,2 used to get selfies
value 1,3 used to get recently added
value 1,4 used to get videos
value 1,5 used to get recently deleted
value 1,7 used to get favorites
Apple demo link
Declare your properties:
@property (nonatomic, strong) NSArray *sectionFetchResults;
@property (nonatomic, strong) PHFetchResult *assetsFetchResults;
@property (nonatomic, strong) PHAssetCollection *assetCollection;
@property (nonatomic, strong) NSMutableArray *numberOfPhotoArray;
I've been banging my head over this too. I've found no way to filter for only assets on the device with fetchAssetsWithMediaType or fetchAssetsInAssetCollection. I'm able to use requestContentEditingInputWithOptions or requestImageDataForAsset to determine whether the asset is on the device, but this is asynchronous and seems to use far too many resources to do for every asset in the list. There must be a better way.
PHFetchResult *fetchResult = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
for (int i = 0; i < [fetchResult count]; i++) {
    PHAsset *asset = fetchResult[i];
    [asset requestContentEditingInputWithOptions:nil
                               completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
        if ([[info objectForKey:PHContentEditingInputResultIsInCloudKey] intValue] == 1) {
            NSLog(@"asset is in cloud");
        } else {
            NSLog(@"asset is on device");
        }
    }];
}
If you don't want to rely on an undocumented API, look at [asset canPerformEditOperation:PHAssetEditOperationContent]. This only returns true if the full original is available on device.
Admittedly this is also fragile, but testing shows it works for all of the assetSource types (photostream, iTunes sync, etc).
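A quick Swift sketch of that check over a fetch result (the fetch itself is an assumption):
import Photos

// Sketch: keep only assets whose full originals are available on-device,
// using canPerform(.content) as suggested above.
let fetchResult = PHAsset.fetchAssets(with: .image, options: nil)
let onDeviceAssets = (0..<fetchResult.count)
    .map { fetchResult.object(at: $0) }
    .filter { $0.canPerform(.content) }
print("On-device assets: \(onDeviceAssets.count)")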

AVMetadataItem: getting the trackNumber from iTunes or ID3 metadata on iOS

I'm trying to get the track number of an AAC, MP3, or MP4 file. It's not in the commonMetadata, so I started to spelunk in the other metadata keys. I found something that looks like it, but I'm as yet unable to read it and make sense of it; even the raw data makes no sense to me.
At the moment, I'm just trying to get it using this basic code:
NSArray *meta = [asset metadataForFormat:AVMetadataFormatiTunesMetadata];
for (AVMetadataItem *item in meta) {
    id key = [item key];
    NSString *value = [item stringValue];
    NSLog(@"key = %@, value = %@", key, value);
}
Knowing I'm looking for AVMetadataiTunesMetadataKeyTrackNumber.
I realize this thread is quite old but I recently came across this issue myself. I needed ALL the metadata I could gather and came up with the following solution. It's not the most elegant solution but it works well enough. Written in Swift.
func processFile(url: NSURL) {
    let yearKey = -1453039239
    let genreKey = -1452841618
    let encoderKey = -1451987089
    let trackKey = "com.apple.iTunes.iTunes_CDDB_TrackNumber"
    let CDDBKey = "com.apple.iTunes.iTunes_CDDB_1"
    let path: String = url.absoluteString
    let asset = AVURLAsset(URL: url, options: nil)
    let format = AVMetadataFormatiTunesMetadata
    for item: AVMetadataItem in asset.metadataForFormat(format) as Array<AVMetadataItem> {
        if let key = item.commonKey { if key == "title" { println(item.value()) } }
        if let key = item.commonKey { if key == "artist" { println(item.value()) } }
        if let key = item.commonKey { if key == "albumName" { println(item.value()) } }
        if let key = item.commonKey { if key == "creationDate" { println(item.value()) } }
        if let key = item.commonKey { if key == "artwork" { println("art") } }
        if item.key().isKindOfClass(NSNumber) {
            if item.key() as NSNumber == yearKey { println("year: \(item.numberValue)") }
            if item.key() as NSNumber == genreKey { println("genre: \(item.stringValue)") }
            if item.key() as NSNumber == encoderKey { println("encoder: \(item.stringValue)") }
        }
        if item.key().isKindOfClass(NSString) {
            if item.key() as String == trackKey { println("track: \(item.stringValue)") }
            if item.key() as String == CDDBKey { println("CDDB: \(item.stringValue)") }
        }
    }
}
If your track has ID3 metadata, you can easily get the numberValue for the track number. If your track has iTunes metadata, the dataValue is all you get; you have to work out the int value yourself.
So far, I'm here. I'm pretty sure I need to work more on the bytes portion:
NSArray *meta = [asset metadataForFormat:AVMetadataFormatiTunesMetadata];
NSArray *itfilteredKeys = [AVMetadataItem metadataItemsFromArray:meta withKey:AVMetadataiTunesMetadataKeyTrackNumber keySpace:nil];
for (AVMetadataItem *item in itfilteredKeys) {
    NSData *value = [item dataValue];
    unsigned char aBuffer[4];
    [value getBytes:aBuffer length:4];
    int value1 = aBuffer[3]; // low byte of the big-endian track number
    NSLog(@"trackNumber from iTunes = %i", value1);
}
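If it helps, here is a Swift sketch of decoding that dataValue, assuming the usual iTunes 'trkn' layout (two reserved bytes, then a big-endian 16-bit track number and a big-endian 16-bit track total):
import Foundation

// Sketch: decode the iTunes track-number data blob under the assumed 'trkn' layout.
func trackNumber(from data: Data) -> (track: Int, total: Int)? {
    guard data.count >= 6 else { return nil }
    let bytes = [UInt8](data)
    let track = Int(bytes[2]) << 8 | Int(bytes[3])
    let total = Int(bytes[4]) << 8 | Int(bytes[5])
    return (track, total)
}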
This question is rather old, but I came upon it as I had a similar problem, so for anyone who still needs a solution: I managed to figure out a way of getting the track number and the total track count using the following code:
NSString *value = @"<0000000a 00140000>"; // [item stringValue] (in this case 10/20)
NSString *track_no_hex = [[value componentsSeparatedByString:@" "][0] stringByReplacingOccurrencesOfString:@"<" withString:@""];
NSString *total_track_no_hex = [[value componentsSeparatedByString:@" "][1] stringByReplacingOccurrencesOfString:@">" withString:@""];
NSString *track_no = @"";
for (int k = 0; k <= 4; k += 4) {
    unsigned result = 0;
    NSScanner *scanner = [NSScanner scannerWithString:[track_no_hex substringWithRange:NSMakeRange(k, 4)]];
    [scanner scanHexInt:&result];
    if (result != 0) {
        track_no = [NSString stringWithFormat:@"%@%u", track_no, result];
    }
}
NSString *total_track_no = @"";
for (int k = 0; k <= 4; k += 4) {
    unsigned result = 0;
    NSScanner *scanner;
    if (k + 4 <= [total_track_no_hex length]) {
        scanner = [NSScanner scannerWithString:[total_track_no_hex substringWithRange:NSMakeRange(k, 4)]];
    }
    [scanner scanHexInt:&result];
    if (result != 0) {
        total_track_no = [NSString stringWithFormat:@"%@%u", total_track_no, result];
    }
}
NSLog(@"%@/%@", track_no, total_track_no); // Output 10/20
This will work fine for track numbers under 14461, which should be large enough considering iTunes' maximum track number is 999.
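As an aside, for the ID3 case mentioned earlier, modern AVFoundation identifiers can do the lookup directly. A sketch, assuming an AVAsset named asset:
import AVFoundation

// Sketch: the ID3 TRCK frame arrives as a string such as "10/20".
let id3Items = asset.metadata(forFormat: .id3Metadata)
let trackItems = AVMetadataItem.metadataItems(from: id3Items,
                                              filteredByIdentifier: .id3MetadataTrackNumber)
print(trackItems.first?.stringValue ?? "no track number tag")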