How to convert PHAsset to PHLivePhoto - ios

I have a PHAsset and I want to get a PHLivePhoto from it.
The PHAsset is the asset backing the PHLivePhoto.
I know about this function:
/// Requests a Live Photo from the given resource URLs. The result handler will be called multiple times to deliver new PHLivePhoto instances with increasingly more content. If a placeholder image is provided, the result handler will first be invoked synchronously to deliver a live photo containing only the placeholder image. Subsequent invocations of the result handler will occur on the main queue.
// The targetSize and contentMode parameters are used to resize the live photo content if needed. If targetSize is equal to CGRectZero, content will not be resized.
// When using this method to provide content for a PHLivePhotoView, each live photo instance delivered via the result handler should be passed to - [PHLivePhotoView setLivePhoto:].
+ (PHLivePhotoRequestID)requestLivePhotoWithResourceFileURLs:(NSArray<NSURL *> *)fileURLs placeholderImage:(UIImage *__nullable)image targetSize:(CGSize)targetSize contentMode:(PHImageContentMode)contentMode resultHandler:(void(^)(PHLivePhoto *__nullable livePhoto, NSDictionary *info))resultHandler;
But I don't know how to use it.
How do I do the conversion?

You can use this method from PHImageManager
- (PHImageRequestID)requestLivePhotoForAsset:(PHAsset *)asset targetSize:(CGSize)targetSize contentMode:(PHImageContentMode)contentMode options:(PHLivePhotoRequestOptions *)options resultHandler:(void (^)(PHLivePhoto *livePhoto, NSDictionary *info))resultHandler
Inside the resultHandler you obtain the PHLivePhoto ready to show in a PHLivePhotoView.
EXAMPLE:
[[PHImageManager defaultManager] requestLivePhotoForAsset:asset
                                                targetSize:self.contentView.frame.size
                                               contentMode:PHImageContentModeDefault
                                                   options:nil
                                             resultHandler:^(PHLivePhoto * _Nullable livePhoto, NSDictionary * _Nullable info) {
    self.contentView.livePhoto = livePhoto;
}];
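For reference, a minimal sketch of the same call with an explicit PHLivePhotoRequestOptions (the asset and livePhotoView names here are placeholders, not from the question; passing nil options as above also works):

PHLivePhotoRequestOptions *options = [[PHLivePhotoRequestOptions alloc] init];
options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
options.networkAccessAllowed = YES; // allow downloading the live photo from iCloud if needed

[[PHImageManager defaultManager] requestLivePhotoForAsset:asset
                                                targetSize:livePhotoView.bounds.size
                                               contentMode:PHImageContentModeAspectFit
                                                   options:options
                                             resultHandler:^(PHLivePhoto * _Nullable livePhoto, NSDictionary * _Nullable info) {
    // The handler can fire more than once (a degraded result first);
    // assign each delivered instance to the PHLivePhotoView.
    livePhotoView.livePhoto = livePhoto;
}];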

Alternatively, a PHContentEditingInput also exposes the live photo:
[asset requestContentEditingInputWithOptions:nil completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
    PHLivePhoto *livePhoto = contentEditingInput.livePhoto;
}];

Related

Very slow to load images from PHAsset

I am using the following code to fetch images and AVAssets from PHAssets. There are two arrays in the code:
galleryArr: stores images for the collection view.
mutableDataArr: stores images (for image assets) and videos (for AVAssets) to upload to the server.
It is very slow to fetch all images from the PHAssets array.
I googled this; most people say to remove the line [options setSynchronous:YES];, but if I remove it the completion is called twice and the array gets duplicate objects (since objects are appended to the array inside the completion).
for (int i = 0; i < assets.count; i++) {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
        options.deliveryMode = PHImageRequestOptionsDeliveryModeOpportunistic;
        options.resizeMode = PHImageRequestOptionsResizeModeExact;
        [options setNetworkAccessAllowed:YES];
        [options setSynchronous:YES];

        PHImageManager *manager = PHImageManager.defaultManager;

        PHVideoRequestOptions *videoOptions = [[PHVideoRequestOptions alloc] init];
        videoOptions.networkAccessAllowed = YES;

        __weak typeof(self) weakSelf = self;
        if (assets[i].mediaType == PHAssetMediaTypeVideo) {
            [manager requestAVAssetForVideo:[assets objectAtIndex:i] options:videoOptions resultHandler:^(AVAsset * _Nullable asset, AVAudioMix * _Nullable audioMix, NSDictionary * _Nullable info) {
                if ([asset isKindOfClass:[AVURLAsset class]]) {
                    [weakSelf.mutableDataArr addObject:asset];
                }
            }];
        }

        [manager requestImageForAsset:[assets objectAtIndex:i]
                           targetSize:CGSizeMake(1024, 1024) // PHImageManagerMaximumSize
                          contentMode:PHImageContentModeAspectFit
                              options:options
                        resultHandler:^(UIImage *image, NSDictionary *info) {
            if (image) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (assets[i].mediaType != PHAssetMediaTypeVideo) {
                        [weakSelf.mutableDataArr addObject:image];
                    }
                    [galleryArr addObject:image];
                    if (i + 1 == assets.count) {
                        [SVProgressHUD dismiss];
                        [weakSelf.galleryCollectionView reloadData];
                    }
                });
            }
        }];
    });
}
Any suggestions, please?
Just one thought, it looks like you are loading all the images from the array before removing your progress HUD and displaying the gallery. As the number of images could be very large and presuming you are using a collection view or similar, that's quite an overhead before anything is displayed.
I did something like this a while ago, and instead of looping through the array and loading everything up front, I let the cells request images as they needed them. This makes it very fast and efficient, as cells can display immediately with a loading icon and then flip to the image when it becomes available. Efficiency comes from only loading images the user is actually going to see.
To make things performant, and by performant I mean I could scroll as fast as I liked without the display freezing, each cell would first check an in-memory cache for the image and then trigger a request for the image on a background thread.
When the image was returned, the cell would add it to the in-memory cache, and then, if the cell had not been reused for a different image (due to fast scrolling), it would display the image.
Further, I used an NSCache for the in-memory cache, so that if the app started to use a lot of memory, images would be automatically dropped and reloaded the next time a cell wanted one.
The summary is to use a memory-aware cache, and only load what you actually need.
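A rough sketch of that per-cell approach, assuming a simple GalleryCell with an imageView and a representedAssetIdentifier property, plus an NSCache (thumbnailCache) and an assets array owned by the controller; these names are illustrative, not taken from the question's code:

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath {
    GalleryCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"GalleryCell" forIndexPath:indexPath];
    PHAsset *asset = self.assets[indexPath.item];
    cell.representedAssetIdentifier = asset.localIdentifier;

    // 1. Check the in-memory cache first.
    UIImage *cached = [self.thumbnailCache objectForKey:asset.localIdentifier];
    if (cached) {
        cell.imageView.image = cached;
        return cell;
    }

    // 2. Otherwise show a placeholder and request the thumbnail asynchronously.
    cell.imageView.image = nil; // or a loading placeholder
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    options.networkAccessAllowed = YES;
    [[PHImageManager defaultManager] requestImageForAsset:asset
                                                targetSize:CGSizeMake(200, 200)
                                               contentMode:PHImageContentModeAspectFill
                                                   options:options
                                             resultHandler:^(UIImage *image, NSDictionary *info) {
        if (!image) { return; }
        [self.thumbnailCache setObject:image forKey:asset.localIdentifier];
        // 3. Only display the image if the cell hasn't been reused for another asset.
        if ([cell.representedAssetIdentifier isEqualToString:asset.localIdentifier]) {
            cell.imageView.image = image;
        }
    }];
    return cell;
}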

How to determine the end of all asynchronous calls from within a loop?

I have an app which combines multiple videos. The initial list of PHAssets is displayed and some are selected, forming an array of PHAssets. Now, on the screen which creates the video, I need to loop through the array and fetch the AVAsset for each PHAsset.
The issue I am trying to understand is how to track the progress and determine the end of all the asynchronous fetches. When the loop is complete I can move on to actually combining all the videos.
for (PHAsset *object in self.arraySelectedAssets) {
    [[PHImageManager defaultManager] requestAVAssetForVideo:object options:nil resultHandler:^(AVAsset *avAsset, AVAudioMix *audioMix, NSDictionary *info) {
        NSLog(@"Fetched");
        // here the asset is nil on iOS 10 only; iOS 11 works fine
        AVURLAsset *assetUrl = (AVURLAsset *)avAsset;
    }];
}
You can store the count of self.arraySelectedAssets and declare a counter variable outside the loop.
Then, in each callback, you increment the counter and check whether it has reached that count.
When it has, you know you have all the assets.
NSUInteger total = self.arraySelectedAssets.count;
__block NSUInteger count = 0; // __block so the blocks can increment it
for (PHAsset *object in self.arraySelectedAssets) {
    [[PHImageManager defaultManager] requestAVAssetForVideo:object options:nil resultHandler:^(AVAsset *avAsset, AVAudioMix *audioMix, NSDictionary *info) {
        NSLog(@"Fetched");
        // here the asset is nil on iOS 10 only; iOS 11 works fine
        AVURLAsset *assetUrl = (AVURLAsset *)avAsset;
        count++;
        if (count == total) {
            // do your stuff: all assets have been fetched
        }
    }];
}
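An alternative to the manual counter (not from the original answer, just a common GCD pattern) is a dispatch group, which calls a completion block once every resultHandler has fired:

dispatch_group_t group = dispatch_group_create();
NSMutableArray<AVAsset *> *avAssets = [NSMutableArray array];

for (PHAsset *object in self.arraySelectedAssets) {
    dispatch_group_enter(group);
    [[PHImageManager defaultManager] requestAVAssetForVideo:object options:nil resultHandler:^(AVAsset *avAsset, AVAudioMix *audioMix, NSDictionary *info) {
        if (avAsset) {
            @synchronized (avAssets) {
                [avAssets addObject:avAsset];
            }
        }
        dispatch_group_leave(group); // leave exactly once per enter
    }];
}

dispatch_group_notify(group, dispatch_get_main_queue(), ^{
    // All fetches have completed; combine the videos here.
});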

Cannot convert UIImage to NSData when fetched with PhotoKit

My scenario is that I select one of the photos fetched from the system album with PhotoKit and presented in my UICollectionView, pass the selected photo (a UIImage) to my next view, and send it to a remote server as NSData.
But when I set a breakpoint before sending to inspect the NSData, I found that the UIImage had data with allocated memory while the NSData did not.
Here is the code to fetch the UIImage (note that I didn't specify any PHImageRequestOptions object):
NSMutableArray *images = @[].mutableCopy;
PHImageManager *manager = [PHImageManager defaultManager];
PHAsset *asset = _photoPickerCollectionView.pickedAsset;
CGSize targetSize = CGSizeMake(asset.pixelWidth * 0.5, asset.pixelHeight * 0.5);
[manager requestImageForAsset:asset targetSize:targetSize contentMode:PHImageContentModeAspectFill options:nil resultHandler:^(UIImage *result, NSDictionary *info) {
    _nextView.pickedOne = result;
}];
[self.navigationController dismissViewControllerAnimated:YES completion:nil];
I convert the UIImage to NSData like this:
UIImage *image = _pickedOne;
NSData *imgData = UIImageJPEGRepresentation(image, imageQuality);
When I inspected the variables, the UIImage contained data, but the NSData was nil.
What was odd is that if I specified a PHImageRequestOptions object when asking the PHImageManager for the image, the NSData was not nil. I'm not sure what changes with or without the PHImageRequestOptions, or why it makes such a difference.
UPDATE:
What I found is that if you pass nil for PHImageRequestOptions, the default options make the fetch asynchronous, which, in my opinion, can be unreliable for producing the NSData; when I specify options.synchronous = YES; it works.
But in this case, would it cause any retain cycle, or would some PhotoKit objects not get released?
The method is asynchronous by default. You need to set options to explicitly make the image fetching synchronous. My strong suggestion for this kind of conversion is to use this API:
- (PHImageRequestID)requestImageDataForAsset:(PHAsset *)asset options:(nullable PHImageRequestOptions *)options resultHandler:(void(^)(NSData *__nullable imageData, NSString *__nullable dataUTI, UIImageOrientation orientation, NSDictionary *__nullable info))resultHandler;
This returns you the data directly but you will have to make this synchronous as well. This WON'T cause any retain cycles for any objects.
Hope this helps. :)
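A minimal sketch of that suggestion, assuming asset is your PHAsset and uploadImageData: is a placeholder for your own upload code (not part of PhotoKit). With synchronous = YES the handler runs before the request call returns, so call this off the main thread for large assets:

PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.synchronous = YES;
options.networkAccessAllowed = YES; // in case the full-size original lives only in iCloud

[[PHImageManager defaultManager] requestImageDataForAsset:asset
                                                   options:options
                                             resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    // imageData is the original file data; no UIImageJPEGRepresentation round trip needed.
    [self uploadImageData:imageData]; // placeholder method for illustration
}];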

requestImageDataForAsset returns nil image data

I've been stuck with this for a couple of days. I've been trying to get the image within the callback, but I always get nil. These are the options I used:
let options = PHImageRequestOptions()
options.deliveryMode = .HighQualityFormat
options.resizeMode = .None
I also tried with options set to nil without any luck. This is the data I got in the info value passed to the block.
[PHImageResultIsInCloudKey: 0,
PHImageResultDeliveredImageFormatKey: 9999,
PHImageFileURLKey: file:///var/mobile/Media/DCIM/100APPLE/IMG_0052.JPG,
PHImageResultRequestIDKey: 84,
PHImageResultIsDegradedKey: 0,
PHImageResultWantedImageFormatKey: 9999,
PHImageResultIsPlaceholderKey: 0,
PHImageFileSandboxExtensionTokenKey:
64b47b046511a340c57aa1e3e6e07994c1a13853;00000000;00000000;0000001a;com.apple.app-sandbox.read;;00000000;00000000;0000000000000000;/private/var/mobile/Media/DCIM/100APPLE/IMG_0051.JPG]
I also tried using requestImageForAsset and got the same result. I thought that by using requestImageDataForAsset I'd get more control over the data.
Also, I thought the file might exist only in the cloud, but it does not, since the PHImageResultIsInCloudKey value is 0; otherwise, I'd download it.
By the way, I am able to get a smaller version of the image with a predefined size of 200x200 in another view; however, when I try to get the larger version, I get nil. I know that the higher-resolution image exists on the phone (I can see it in the Photos app).
Any help will be appreciated.
I had a similar issue and tried setting the networkAccessAllowed option on the PHImageRequestOptions object to YES, and that seemed to fix it. For me the issue was definitely that the images were in the cloud: images on the camera roll worked fine, but those in the cloud did not.
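A minimal sketch of that fix, assuming asset is the PHAsset in question (the progressHandler is optional and only added here for feedback):

PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
options.resizeMode = PHImageRequestOptionsResizeModeNone;
options.networkAccessAllowed = YES; // download from iCloud if the original is not on the device
options.progressHandler = ^(double progress, NSError *error, BOOL *stop, NSDictionary *info) {
    NSLog(@"iCloud download progress: %f", progress);
};

[[PHImageManager defaultManager] requestImageDataForAsset:asset
                                                   options:options
                                             resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    NSLog(@"received %lu bytes", (unsigned long)imageData.length);
}];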
I have the same question. I think there is a bug in the PHImageRequestOptions class, so we pass nil in the code below; it worked for me.
dispatch_semaphore_t sema = dispatch_semaphore_create(0);
[[PHImageManager defaultManager] requestImageDataForAsset:asset options:nil resultHandler:^(NSData * _Nullable imageData, NSString * _Nullable dataUTI, UIImageOrientation orientation, NSDictionary * _Nullable info) {
    assetModel.size = imageData.length;
    NSString *filename = [asset valueForKey:@"filename"];
    assetModel.fileName = filename;
    dispatch_semaphore_signal(sema);
}];
dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER);

How to know if a PHAsset has been modified?

More specifically, how can you know whether the current version of a PHAsset's underlying asset differs from the original?
My user should only need to choose between the current and original asset when necessary, and then I need their answer for PHImageRequestOptions.version.
As of iOS 16, PHAsset has a hasAdjustments property which indicates if the asset has been edited.
For previous releases, you can get an array of data resources for a given asset via the PHAssetResource API; it will include an adjustment-data resource if that asset has been edited:
let isEdited = PHAssetResource.assetResources(for: asset).contains(where: { $0.type == .adjustmentData })
Note that if you want to actually work with a resource file, you have to fetch its data using a PHAssetResourceManager API. Also note that this method returns right away - there's no waiting for an async network request, unlike other answers here.
I have found two ways of checking PHAsset for modifications.
- (void)tb_checkForModificationsWithEditingInputMethodCompletion:(void (^)(BOOL))completion {
    PHContentEditingInputRequestOptions *options = [PHContentEditingInputRequestOptions new];
    options.canHandleAdjustmentData = ^BOOL(PHAdjustmentData *adjustmentData) { return YES; };
    [self requestContentEditingInputWithOptions:options completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
        if (completion) completion(contentEditingInput.adjustmentData != nil);
    }];
}

- (void)tb_checkForModificationsWithAssetPathMethodCompletion:(void (^)(BOOL))completion {
    PHVideoRequestOptions *options = [PHVideoRequestOptions new];
    options.deliveryMode = PHVideoRequestOptionsDeliveryModeFastFormat;
    [[PHImageManager defaultManager] requestAVAssetForVideo:self options:options resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
        if (completion) completion([[asset description] containsString:@"/Mutations/"]);
    }];
}
EDIT: I was at the point where I needed the same functionality for PHAsset with an image. I used this:
- (void)tb_checkForModificationsWithAssetPathMethodCompletion:(void (^)(BOOL))completion {
    [self requestContentEditingInputWithOptions:nil completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
        NSString *path = (contentEditingInput.avAsset) ? [contentEditingInput.avAsset description] : contentEditingInput.fullSizeImageURL.path;
        completion([path containsString:@"/Mutations/"]);
    }];
}
Take a look at PHImageRequestOptionsVersion:
PHImageRequestOptionsVersionCurrent
Request the most recent version of the image asset (the one that reflects all edits). The resulting image is the rendered output from all previously made adjustments.
PHImageRequestOptionsVersionUnadjusted
Request a version of the image asset without adjustments. If the asset has been edited, the resulting image reflects the state of the asset before any edits were performed.
PHImageRequestOptionsVersionOriginal
Request the original, highest-fidelity version of the image asset. The resulting image is the originally captured or imported version of the asset, regardless of any edits made.
If you ask the user before retrieving assets, you know which version the user specified. If you get a PHAsset from elsewhere, you can use revertAssetContentToOriginal (on PHAssetChangeRequest) to get back the original asset. PHAsset also has modificationDate and creationDate properties, which you can compare to tell whether a PHAsset has been modified.
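As a minimal sketch, once the user has chosen, you would set the corresponding version on the request options (userWantsOriginal, asset, and the handler body are placeholders):

PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.version = userWantsOriginal ? PHImageRequestOptionsVersionOriginal : PHImageRequestOptionsVersionCurrent;

[[PHImageManager defaultManager] requestImageForAsset:asset
                                            targetSize:PHImageManagerMaximumSize
                                           contentMode:PHImageContentModeDefault
                                               options:options
                                         resultHandler:^(UIImage *image, NSDictionary *info) {
    // image reflects the requested version of the asset
}];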
I found this code to be the only one working for now, and it handles most of the edge cases. It may not be the fastest, but it works well for most image types. It requests the smallest possible original and modified images and compares their data content.
@implementation PHAsset (Utilities)

- (void)checkEditingHistoryCompletion:(void (^)(BOOL edited))completion
{
    PHImageManager *manager = [PHImageManager defaultManager];
    CGSize compareSize = CGSizeMake(64, 48);
    PHImageRequestOptions *requestOptions = [PHImageRequestOptions new];
    requestOptions.synchronous = YES;
    requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeFastFormat;
    requestOptions.version = PHImageRequestOptionsVersionOriginal;

    [manager requestImageForAsset:self
                       targetSize:compareSize
                      contentMode:PHImageContentModeAspectFit
                          options:requestOptions
                    resultHandler:^(UIImage *originalResult, NSDictionary *info) {
        UIImage *currentImage = originalResult;
        requestOptions.version = PHImageRequestOptionsVersionCurrent;
        [manager requestImageForAsset:self
                           targetSize:currentImage.size
                          contentMode:PHImageContentModeAspectFit
                              options:requestOptions
                        resultHandler:^(UIImage *currentResult, NSDictionary *info) {
            NSData *currData = UIImageJPEGRepresentation(currentResult, 0.1);
            NSData *orgData = UIImageJPEGRepresentation(currentImage, 0.1);
            if (completion) {
                // handle the case when neither image can be retrieved; it also means no edits
                if ((currData == nil) && (orgData == nil)) {
                    completion(NO);
                    return;
                }
                completion(([currData isEqualToData:orgData] == NO));
            }
        }];
    }];
}

@end
