Check if PHAsset exists in PHFetchResult - iOS

How do I know if the asset for a local identifier is not found? I have a list of local identifiers for photos and videos fetched via the Photos framework; how can I tell whether a given photo is still present in the iOS photo library?

You need to keep track of the number of assets in that fetch result, and if you haven't found the asset by the time the last one has been checked, return the Not Found notification.
You can do it like:
NSString *localId = /* local identifier of the photo */;
PHFetchResult *userAlbums = [PHAsset fetchAssetsWithLocalIdentifiers:@[localId] options:nil];
NSUInteger assetCount = [userAlbums count];
__block NSUInteger assetCounter = 0;
[userAlbums enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop)
{
    assetCounter++;
    // Check whether this is the asset you are looking for
    if (/* asset found */)
    {
        // Asset found; set the stop bool to YES
        *stop = YES;
    }
    else if (assetCounter == assetCount)
    {
        // Last asset checked and still not found
    }
}];

Why not just use:
if ([userAlbums count]) {
    // At least one item found.
}
else {
    // Nothing found.
}
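In Swift the same existence check collapses to a couple of lines. A minimal sketch, assuming localId holds a previously stored PHAsset.localIdentifier:
import Photos

func assetExists(localId: String) -> Bool {
    // fetchAssets returns an empty result if the asset was deleted
    let result = PHAsset.fetchAssets(withLocalIdentifiers: [localId], options: nil)
    return result.firstObject != nil
}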

Related

Can I tell whether a PHAsset / AVAsset is an HDR / Dolby Vision video on iPhone 12?

I want to add an HDR icon to indicate that an asset is HDR, but I can't find any info to check whether a video was recorded in HDR on an iPhone 12.
+ (BOOL)isHDRVideo:(AVAsset *)avasset {
    if (!avasset) {
        return NO;
    }
    __block BOOL isHDRVideo = NO;
    [avasset.tracks enumerateObjectsUsingBlock:^(AVAssetTrack * _Nonnull track, NSUInteger idx, BOOL * _Nonnull stopTracks) {
        [track.formatDescriptions enumerateObjectsUsingBlock:^(id _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stopFormatDescriptions) {
            CMFormatDescriptionRef desc = (__bridge CMVideoFormatDescriptionRef)obj;
            NSDictionary *dic = (__bridge NSDictionary *)CMFormatDescriptionGetExtensions(desc);
            NSString *imageBufferColorPrimaries = dic[(__bridge id)kCVImageBufferColorPrimariesKey];
            if ([imageBufferColorPrimaries isEqualToString:(__bridge id)kCVImageBufferColorPrimaries_ITU_R_2020]) {
                *stopFormatDescriptions = YES;
                *stopTracks = YES;
                isHDRVideo = YES;
            }
        }];
    }];
    return isHDRVideo;
}
A better approach could be using avasset.tracks(withMediaCharacteristic: .containsHDRVideo)
Or:
simpleVideo.tracks.contains { $0.hasMediaCharacteristic(.containsHDRVideo) }
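As a concrete sketch of that characteristic-based check (the characteristic is available from iOS 14), assuming asset is an AVAsset you already have:
import AVFoundation

// True if any track advertises the HDR media characteristic
let isHDR = !asset.tracks(withMediaCharacteristic: .containsHDRVideo).isEmpty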
This post really helped me out, so I felt I had to share back what I've found and provide a Swift-based approach for future readers.
One can extend AVAssetTrack to move all the logic where the CMFormatDescription is defined:
import AVFoundation
import CoreMedia

public extension AVAssetTrack {
    var isHDRVideo: Bool {
        guard
            self.mediaType == .video, // A track that is not video cannot be HDR
            let formatDescription = self.formatDescriptions.first, // Safely get the description
            let transferFunction = CMFormatDescriptionGetExtension(
                formatDescription as! CMFormatDescription,
                extensionKey: kCVImageBufferTransferFunctionKey) // Can be nil for tracks without this extension
        else { return false }
        return [
            kCVImageBufferTransferFunction_ITU_R_2020,
            kCVImageBufferTransferFunction_ITU_R_2100_HLG,
            kCVImageBufferTransferFunction_SMPTE_ST_2084_PQ
        ].map { $0 as String }.contains(transferFunction as? String ?? "")
    }
}
Then say you have an AVAsset called asset, you can do:
asset.tracks(withMediaType: .video).map { $0.isHDRVideo }
and build your logic on that.
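If you start from a PHAsset rather than an AVAsset, you first have to load the underlying AVAsset. A minimal sketch, where showHDRBadge is a hypothetical callback of yours:
import Photos
import AVFoundation

func checkHDR(for phAsset: PHAsset, showHDRBadge: @escaping (Bool) -> Void) {
    let options = PHVideoRequestOptions()
    options.isNetworkAccessAllowed = true // allow fetching from iCloud if needed
    PHImageManager.default().requestAVAsset(forVideo: phAsset, options: options) { avAsset, _, _ in
        // Uses the isHDRVideo extension defined above
        let isHDR = avAsset?.tracks(withMediaType: .video)
            .contains(where: { $0.isHDRVideo }) ?? false
        DispatchQueue.main.async { showHDRBadge(isHDR) } // showHDRBadge is hypothetical
    }
}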

PHAsset Null but local identifier is valid

I am trying to load an asset from a local identifier. The local identifier seems to be correct, but the asset is null and I can't figure out why. I have similar code in another part of my app that works fine.
Queue *queue = [NSEntityDescription insertNewObjectForEntityForName:@"Queue" inManagedObjectContext:self.context];
for (int reyrt = 0; reyrt < self.storeGIF.count; reyrt++) {
    queue.queuetextimagePath = [self.storeGIF objectAtIndex:reyrt];
}
__block float fileSize;
NSArray *identifiers = @[queue.queuetextimagePath];
PHFetchResult *assetsFetchResult = [PHAsset fetchAssetsWithLocalIdentifiers:identifiers options:nil];
PHAsset *asset = [assetsFetchResult firstObject];
NSLog(@"identifiers: %@", identifiers);
NSLog(@"assetsFetchResult: %@", assetsFetchResult);
NSLog(@"asset: %@", asset);
if (!asset) {
    NSLog(@"can't retrieve PHAsset from localIdentifier: %@", identifiers);
}
Here is the NSLog output when running the above code.
2019-07-19 09:26:36.366739-0400 myApp[1440:328144] identifiers: (
    "857AC3DA-C047-4D88-911B-C5FE227E2B96/L0/001"
)
2019-07-19 09:26:36.366945-0400 myApp[1440:328144] assetsFetchResult: <PHFetchResult: 0x281129900> count=0
2019-07-19 09:26:36.366982-0400 myApp[1440:328144] asset: (null)
2019-07-19 09:26:36.367061-0400 myApp[1440:328144] can't retrieve PHAsset from localIdentifier: (
    "857AC3DA-C047-4D88-911B-C5FE227E2B96/L0/001"
)
The problem is that you should drop the /L0/001 part before requesting the object. I don't understand why the identifier strings contain these suffixes, but it works if you drop them and use only the plain UUID part.
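A minimal Swift sketch of that fix, assuming stored identifiers are shaped like the one in the log above:
import Photos

let stored = "857AC3DA-C047-4D88-911B-C5FE227E2B96/L0/001"
// Keep only the UUID portion before the first "/"
let uuidOnly = stored.components(separatedBy: "/").first ?? stored
let result = PHAsset.fetchAssets(withLocalIdentifiers: [uuidOnly], options: nil)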

PHAsset assetResourcesForAsset fails when called too often

I need to retrieve the names of all the PHAssets in the Camera Roll, individually and in a short time.
To get the file name, I use the documented originalFilename property of the PHAssetResource associated with the PHAsset.
This works fine for the first assets, but at some point (after around 400 assets), it starts failing and returning nil every time.
Here is a code that shows this behavior (running on an iPhone 7 with ~800 photos in the Camera Roll):
PHFetchResult *result = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum
                                                                 subtype:PHAssetCollectionSubtypeSmartAlbumUserLibrary
                                                                 options:nil];
PHAssetCollection *assetCollection = result.firstObject;
PHFetchResult *assetsFetchResult = [PHAsset fetchAssetsInAssetCollection:assetCollection options:nil];
for (int i = 0; i < [assetsFetchResult count]; i++) {
    PHAsset *asset = assetsFetchResult[i];
    NSArray *resources = [PHAssetResource assetResourcesForAsset:asset];
    NSString *name = (resources.count > 0) ? [(PHAssetResource *)resources.firstObject originalFilename] : nil;
    NSLog(@"%i: %@", i, name);
}
When using undocumented methods to get the file name, such as [asset valueForKey:@"filename"] or the PHImageFileURLKey key of the info dictionary returned by the PHImageManager, everything works well (although the name differs from originalFilename, and it's not reliable since it's not documented).
How come the official method is that unreliable?
Is there something I do wrong?

How to get only images in the camera roll using Photos Framework

The following code also loads images that are located on iCloud or in Photo Stream. How can we limit the fetch to only images in the camera roll?
var assets = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: nil)
After adding the Camera Roll and Photo Stream albums, Apple added the following PHAssetCollectionSubtype types in iOS 8.1:
PHAssetCollectionSubtypeAlbumMyPhotoStream (together with PHAssetCollectionTypeAlbum) - fetches the Photo Stream album.
PHAssetCollectionSubtypeSmartAlbumUserLibrary (together with PHAssetCollectionTypeSmartAlbum) - fetches the Camera Roll album.
Haven't tested if this is backward-compatible with iOS 8.0.x though.
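A minimal sketch of that fetch in modern Swift, assuming you only want still images:
import Photos

let collections = PHAssetCollection.fetchAssetCollections(
    with: .smartAlbum, subtype: .smartAlbumUserLibrary, options: nil)
if let cameraRoll = collections.firstObject {
    let options = PHFetchOptions()
    // Restrict the fetch to images only
    options.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
    let images = PHAsset.fetchAssets(in: cameraRoll, options: options)
    print("Camera Roll images: \(images.count)")
}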
Through some experimentation we discovered a hidden property not listed in the documentation (assetSource). Basically you do a regular fetch request, then use a predicate to filter the ones from the camera roll; for those assets the value should be 3.
Sample code:
// fetch all assets, then sub-fetch only the range we need
var results = NSMutableArray()
var assets = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: nil)
assets.enumerateObjectsUsingBlock { (obj, idx, bool) -> Void in
    results.addObject(obj)
}
var cameraRollAssets = results.filteredArrayUsingPredicate(NSPredicate(format: "assetSource == %@", argumentArray: [3]))
results = NSMutableArray(array: cameraRollAssets)
If, like me, you were searching for Objective-C code and kept finding deprecated AssetsLibrary examples instead of the newer Photos framework, this will help you:
Swift
func getAllPhotosFromCameraRoll() -> [UIImage] {
    // TODO: Add `NSPhotoLibraryUsageDescription` to Info.plist
    PHPhotoLibrary.requestAuthorization { print($0) } // TODO: Move this somewhere before attempting to access photos
    var images = [UIImage]()
    let requestOptions: PHImageRequestOptions = PHImageRequestOptions()
    requestOptions.resizeMode = .exact
    requestOptions.deliveryMode = .highQualityFormat
    requestOptions.isSynchronous = true
    let fetchResult: PHFetchResult = PHAsset.fetchAssets(with: .image, options: nil)
    let manager: PHImageManager = PHImageManager.default()
    for i in 0..<fetchResult.count {
        let asset = fetchResult.object(at: i)
        manager.requestImage(
            for: asset,
            targetSize: PHImageManagerMaximumSize,
            contentMode: .default,
            options: requestOptions,
            resultHandler: { (image: UIImage?, info: [AnyHashable: Any]?) -> Void in
                if let image = image {
                    images.append(image)
                }
            })
    }
    return images
}
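A usage note: because the requests above are synchronous, calling this on the main thread will block the UI, so a sketch of how you might dispatch it:
DispatchQueue.global(qos: .userInitiated).async {
    let photos = getAllPhotosFromCameraRoll()
    DispatchQueue.main.async {
        // update the UI with photos
    }
}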
Objective-C
Global Variables:
NSArray *imageArray;
NSMutableArray *mutableArray;
The method below will help you:
- (void)getAllPhotosFromCamera
{
    imageArray = [[NSArray alloc] init];
    mutableArray = [[NSMutableArray alloc] init];
    PHImageRequestOptions *requestOptions = [[PHImageRequestOptions alloc] init];
    requestOptions.resizeMode = PHImageRequestOptionsResizeModeExact;
    requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
    requestOptions.synchronous = YES;
    PHFetchResult *result = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
    NSLog(@"%d", (int)result.count);
    PHImageManager *manager = [PHImageManager defaultManager];
    NSMutableArray *images = [NSMutableArray arrayWithCapacity:[result count]];
    // result contains PHAsset objects.
    __block UIImage *ima;
    for (PHAsset *asset in result) {
        // Request the image for each asset
        [manager requestImageForAsset:asset
                           targetSize:PHImageManagerMaximumSize
                          contentMode:PHImageContentModeDefault
                              options:requestOptions
                        resultHandler:^void(UIImage *image, NSDictionary *info) {
                            ima = image;
                            [images addObject:ima];
                        }];
    }
    imageArray = [images copy]; // You can also use the NSMutableArray `images` directly
}
If you use your own PHCachingImageManager instead of the shared PHImageManager instance, then when you call requestImageForAsset:targetSize:contentMode:options:resultHandler: you can set an option in PHImageRequestOptions to require that the image be local.
networkAccessAllowed
A Boolean value that specifies whether Photos can download the requested image from iCloud.
Discussion
If YES, and the requested image is not stored on the local device, Photos downloads the image from iCloud. To be notified of the download’s progress, use the progressHandler property to provide a block that Photos calls periodically while downloading the image. If NO (the default), and the image is not on the local device, the PHImageResultIsInCloudKey value in the result handler’s info dictionary indicates that the image is not available unless you enable network access.
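A minimal sketch of that local-only request, assuming asset is a PHAsset from your fetch result:
import Photos

let options = PHImageRequestOptions()
options.isNetworkAccessAllowed = false // do not download from iCloud
PHImageManager.default().requestImage(for: asset,
                                      targetSize: PHImageManagerMaximumSize,
                                      contentMode: .default,
                                      options: options) { image, info in
    let inCloud = (info?[PHImageResultIsInCloudKey] as? NSNumber)?.boolValue ?? false
    if image == nil && inCloud {
        // Original is only in iCloud; skip it for a camera-roll-only list
    }
}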
This can help. You can use your own data model instead of the AlbumModel I used.
func getCameraRoll() -> AlbumModel {
    var cameraRollAlbum: AlbumModel!
    let cameraRoll = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .smartAlbumUserLibrary, options: nil)
    cameraRoll.enumerateObjects { (collection: PHAssetCollection, _, _) in
        let fetchOptions = PHFetchOptions()
        fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
        let assets = PHAsset.fetchAssets(in: collection, options: fetchOptions)
        if assets.count > 0 {
            cameraRollAlbum = AlbumModel(name: collection.localizedTitle ?? "", count: assets.count, collection: collection, assets: assets)
        }
    }
    return cameraRollAlbum
}
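Hypothetical usage, assuming your AlbumModel exposes the name and count it was initialized with:
let cameraRollAlbum = getCameraRoll()
print("\(cameraRollAlbum.name): \(cameraRollAlbum.count) photos") // properties assumed from the initializer above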
Here is an Objective-C version, based on Apple's sample code.
- (NSMutableArray *)getNumberOfPhotoFromCameraRoll:(NSArray *)array {
    PHFetchResult *fetchResult = array[1];
    int index = 0;
    unsigned long pictures = 0;
    for (int i = 0; i < fetchResult.count; i++) {
        unsigned long temp = [PHAsset fetchAssetsInAssetCollection:fetchResult[i] options:nil].count;
        if (temp > pictures) {
            pictures = temp;
            index = i;
        }
    }
    PHCollection *collection = fetchResult[index];
    if (![collection isKindOfClass:[PHAssetCollection class]]) {
        // return;
    }
    // Configure the AAPLAssetGridViewController with the asset collection.
    PHAssetCollection *assetCollection = (PHAssetCollection *)collection;
    PHFetchResult *assetsFetchResult = [PHAsset fetchAssetsInAssetCollection:assetCollection options:nil];
    self.assetsFetchResults = assetsFetchResult;
    self.assetCollection = assetCollection;
    self.numberOfPhotoArray = [NSMutableArray array];
    for (int i = 0; i < [assetsFetchResult count]; i++) {
        PHAsset *asset = assetsFetchResult[i];
        [self.numberOfPhotoArray addObject:asset];
    }
    NSLog(@"%lu", (unsigned long)[self.numberOfPhotoArray count]);
    return self.numberOfPhotoArray;
}
Where you can grab the following details:
PHFetchResult *fetchResult = self.sectionFetchResults[1];
PHCollection *collection = fetchResult[6];
The index pairs map to smart albums as follows:
value 1,6 used to get camera images
value 1,0 used to get screen shots
value 1,1 used to get hidden
value 1,2 used to get selfies
value 1,3 used to get recently added
value 1,4 used to get videos
value 1,5 used to get recently deleted
value 1,7 used to get favorites
See Apple's demo project for the full example.
Declare your properties:
@property (nonatomic, strong) NSArray *sectionFetchResults;
@property (nonatomic, strong) PHFetchResult *assetsFetchResults;
@property (nonatomic, strong) PHAssetCollection *assetCollection;
@property (nonatomic, strong) NSMutableArray *numberOfPhotoArray;
I've been banging my head over this too. I've found no way to filter for only on-device assets with fetchAssetsWithMediaType or fetchAssetsInAssetCollection. I'm able to use requestContentEditingInputWithOptions or requestImageDataForAsset to determine whether an asset is on the device, but this is asynchronous and seems to use far too many resources to run for every asset in the list. There must be a better way.
PHFetchResult *fetchResult = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
for (int i = 0; i < [fetchResult count]; i++) {
    PHAsset *asset = fetchResult[i];
    [asset requestContentEditingInputWithOptions:nil
                               completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
        if ([[info objectForKey:PHContentEditingInputResultIsInCloudKey] intValue] == 1) {
            NSLog(@"asset is in cloud");
        } else {
            NSLog(@"asset is on device");
        }
    }];
}
If you don't want to rely on an undocumented API, look at [asset canPerformEditOperation:PHAssetEditOperationContent]. This only returns true if the full original is available on device.
Admittedly this is also fragile, but testing shows it works for all of the assetSource types (photostream, iTunes sync, etc).
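A minimal sketch of that check, filtering a fetch result down to assets whose full-size originals are on the device:
import Photos

let all = PHAsset.fetchAssets(with: .image, options: nil)
var localAssets: [PHAsset] = []
all.enumerateObjects { asset, _, _ in
    // false when the full-size original is only in iCloud
    if asset.canPerformEditOperation(.content) {
        localAssets.append(asset)
    }
}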

Reading video metadata on iOS 7 - always empty

I am exporting a QuickTime video using AVAssetExportSession and setting its metadata as follows:
AVMutableMetadataItem *newMetaDataCommentItem = [[AVMutableMetadataItem alloc] init];
[newMetaDataCommentItem setKeySpace:AVMetadataKeySpaceQuickTimeMetadata];
[newMetaDataCommentItem setKey:AVMetadataQuickTimeMetadataKeyComment];
[newMetaDataCommentItem setValue:@"Test metadata value"];
NSMutableArray *metaData = [NSMutableArray array];
[metaData addObject:newMetaDataCommentItem];
exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition
                                            presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = [[SNMovieManager instance] urlForFinalMovie];
exporter.metadata = metaData;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.shouldOptimizeForNetworkUse = YES;
exporter.videoComposition = video;
I then import the video file to my Mac, run mdls on it, and see the value has been set correctly: kMDItemComment = "Test metadata value"
The bit I can't do is read that value back. I am using the following to read the file. The asset is correct, but the metadata property is always an empty dictionary.
[group enumerateAssetsUsingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop) {
    if ([[result valueForProperty:@"ALAssetPropertyType"] isEqualToString:@"ALAssetTypeVideo"])
    {
        ALAssetRepresentation *rep = result.defaultRepresentation;
        NSDictionary *metadata = rep.metadata;
        [images addObject:(id)rep.fullScreenImage];
    }
}];
Does anyone know if I am taking the correct approach here, and if not, what the correct approach to read this comment back is?
Thanks
Simon
It would be appreciated if you could provide more of the code related to the photo library save process.
Otherwise, the only answer is: metadata will return nil if the representation is one that the system cannot interpret.
The returned dictionary holds the properties of the video at a specified location in a file source.
I think your problem is in the metadata-reading code.
You should get an AVURLAsset first and read the metadata from it; ALAssetRepresentation's metadata is different.
[group enumerateAssetsUsingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop) {
    if ([[result valueForProperty:@"ALAssetPropertyType"] isEqualToString:@"ALAssetTypeVideo"])
    {
        AVURLAsset *videoAsset = [AVURLAsset assetWithURL:[[result defaultRepresentation] url]];
        if ([[videoAsset metadataForFormat:AVMetadataFormatQuickTimeMetadata] count]) {
            AVMetadataItem *meta = [[videoAsset metadataForFormat:AVMetadataFormatQuickTimeMetadata] objectAtIndex:0];
            NSLog(@"%@", meta);
            NSLog(@"%lu", (unsigned long)[[videoAsset metadataForFormat:AVMetadataFormatQuickTimeMetadata] count]);
        }
    }
}];
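For completeness, a minimal Swift sketch of reading the comment back, assuming movieURL is the URL you exported to:
import AVFoundation

let exported = AVURLAsset(url: movieURL)
let items = exported.metadata(forFormat: .quickTimeMetadata)
// Filter the items down to the QuickTime comment key
let comment = AVMetadataItem.metadataItems(from: items,
                                           withKey: AVMetadataKey.quickTimeMetadataKeyComment,
                                           keySpace: .quickTimeMetadata).first?.stringValue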
