Hundreds of pictures in a UIImage array - memory leak - iOS

I am working on an app which creates frames out of a recorded video:
var videoFrames: [UIImage] = [UIImage]()

func loadImages() {
    let generator = AVAssetImageGenerator(asset: asset)
    generator.generateCGImagesAsynchronously(forTimes: capturedFrames, completionHandler: { requestedTime, image, actualTime, result, error in
        DispatchQueue.main.async {
            if let image = image {
                self.videoFrames.append(UIImage(cgImage: image))
            }
        }
    })
}
The code works fine for up to roughly 300 loaded images.
When there are more, the app is terminated due to a memory issue. I am fairly new to Swift - how can I debug this further?
Is there a better way to store so many images? Would splitting them across a couple of arrays fix the issue?
My goal is to store thousands of photos (up to 1920x1080) efficiently - maybe you can recommend a better method?

Write each image to disk and maintain a database with the image name and path.
if let image = image {
    let uiImage = UIImage(cgImage: image)
    let fileURL = URL(fileURLWithPath: "__file_path__" + "\(actualTime).png")
    try? uiImage.pngData()?.write(to: fileURL)
    // write file path and image name to database
}
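With that approach only lightweight file URLs stay in memory instead of decoded UIImages. A minimal sketch of the read-back side (frameURLs below is a hypothetical replacement for the videoFrames array):
var frameURLs: [URL] = []   // hypothetical: filled as each frame is written to disk

// Load a single frame back only when it is actually needed, e.g. for display.
func frame(at index: Int) -> UIImage? {
    guard frameURLs.indices.contains(index) else { return nil }
    return UIImage(contentsOfFile: frameURLs[index].path)
}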

I'm adding some code of mine; it's from an old app that was never published. It's in Objective-C, but the concepts are still valid. The main difference from the other code posted is that the handler also takes the orientation of the captured video into consideration; of course you must give a value to the orientation variable.
__block int i = 0;
AVAssetImageGeneratorCompletionHandler handler = ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded) {
        NSMutableDictionary *metadata = @{}.mutableCopy;
        [metadata setObject:@(recordingOrientation) forKey:(NSString *)kCGImagePropertyOrientation];
        NSString *path = [mainPath stringByAppendingPathComponent:[NSString stringWithFormat:@"Image_%.5ld.jpg", (long)i]];
        CFURLRef url = (__bridge_retained CFURLRef)[NSURL fileURLWithPath:path];
        CFMutableDictionaryRef metadataImage = (__bridge_retained CFMutableDictionaryRef)metadata;
        CGImageDestinationRef destination = CGImageDestinationCreateWithURL(url, kUTTypeJPEG, 1, NULL);
        CGImageDestinationAddImage(destination, im, metadataImage);
        if (!CGImageDestinationFinalize(destination)) {
            DLog(@"Failed to write image to %@", path);
        }
        else {
            DLog(@"Writing image to %@", path);
        }
        // The __bridge_retained casts transfer ownership to Core Foundation,
        // so balance them (and the Create call) to avoid leaking per frame.
        CFRelease(destination);
        CFRelease(metadataImage);
        CFRelease(url);
    }
    if (result == AVAssetImageGeneratorFailed) {
        DLog(@"Failed with error: %@ code %ld for CMTime requested %@ and CMTime actual %@", [error localizedDescription], (long)error.code, CFAutorelease(CMTimeCopyDescription(kCFAllocatorDefault, requestedTime)), CFAutorelease(CMTimeCopyDescription(kCFAllocatorDefault, actualTime)));
        DLog(@"Asset %@", videoAsset);
    }
    if (result == AVAssetImageGeneratorCancelled) {
        NSLog(@"Canceled");
    }
    ++i;
};
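Tying the two answers together, here is a minimal Swift sketch of the full flow, assuming the question's asset and capturedFrames plus a hypothetical writable framesDirectory URL; every generated frame goes straight to disk instead of accumulating in an array:
import AVFoundation
import UIKit

let generator = AVAssetImageGenerator(asset: asset)
generator.appliesPreferredTrackTransform = true   // respect the video's orientation

generator.generateCGImagesAsynchronously(forTimes: capturedFrames) { _, cgImage, actualTime, result, _ in
    guard result == .succeeded, let cgImage = cgImage else { return }
    autoreleasepool {
        let url = framesDirectory.appendingPathComponent("frame_\(actualTime.value).jpg")
        // JPEG keeps the on-disk footprint far smaller than PNG for 1080p frames.
        try? UIImage(cgImage: cgImage).jpegData(compressionQuality: 0.8)?.write(to: url)
    }
}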

Related

Get Disparity or Depth Data from PHImageManager

I am trying to get a disparity or depth map from a PHAsset. I found an example where the map is obtained from an image loaded with PHImageManager, and implemented it:
- (AVDepthData *)getDepthDataFromSource:(CGImageSourceRef)source
{
    NSDictionary *depthData = CFBridgingRelease(CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, kCGImageAuxiliaryDataTypeDepth));
    if (!depthData) {
        depthData = CFBridgingRelease(CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, kCGImageAuxiliaryDataTypeDisparity));
    }
    if (!depthData)
        return nil; // code returns here
    NSError *creationError = nil;
    AVDepthData *data = [AVDepthData depthDataFromDictionaryRepresentation:depthData
                                                                     error:&creationError];
    return data;
}
// from an image
[[PHImageManager defaultManager] requestImageForAsset:asset
                                           targetSize:size
                                          contentMode:contentMode
                                              options:requestOptions
                                        resultHandler:^(UIImage * _Nullable result, NSDictionary * _Nullable info) {
    CGImageRef im = result.CGImage;
    CGImageSourceRef source = CGImageSourceCreateWithDataProvider(CGImageGetDataProvider(im), (__bridge CFDictionaryRef)@{});
    AVDepthData *data = [self getDepthDataFromSource:source]; // data is nil
    if (source != nil)
    {
        CFRelease(source);
    }
}];
// from data
PHImageRequestOptions *imageDataRequestOptions = [[PHImageRequestOptions alloc] init];
imageDataRequestOptions.networkAccessAllowed = YES;
imageDataRequestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
PHImageRequestID requestId = [[PHImageManager defaultManager] requestImageDataForAsset:asset
                                                                                options:imageDataRequestOptions
                                                                          resultHandler:^(NSData * _Nullable imageData, NSString * _Nullable dataUTI, UIImageOrientation orientation, NSDictionary * _Nullable info) {
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, (__bridge CFDictionaryRef)@{});
    AVDepthData *data = [self getDepthDataFromSource:source]; // data is nil
    if (source != nil)
    {
        CFRelease(source);
    }
}];
// from the full resolution image
__block FADisparityDataReader *selfStrong = self;
PHContentEditingInputRequestOptions *options = [[PHContentEditingInputRequestOptions alloc] init];
options.networkAccessAllowed = YES;
[asset requestContentEditingInputWithOptions:options
                           completionHandler:^(PHContentEditingInput * _Nullable contentEditingInput, NSDictionary * _Nonnull info) {
    NSURL *fullSizePath = [contentEditingInput fullSizeImageURL];
    CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)fullSizePath, (__bridge CFDictionaryRef)@{});
    @onExit {
        CFRelease(source);
    };
    AVDepthData *data = [self getDepthDataFromSource:source]; // data is not nil
    completion(data);
}];
I only get depth data in the requestContentEditingInputWithOptions block, but that takes too long, and I believe I should be able to get a depth map from PHImageManager images.
How can I get the data from PHImageManager?
The 2017 WWDC session 'Editing with Depth' has some guidance, but does not include a full code example.
You can use the Apple sample app 'PhotoBrowse' to test the methods of accessing the depth data.
See my modified PhotoBrowse, which loads depth and disparity images. It uses the Accelerate framework's vDSP vector functions to normalize disparity to the 0..1 range:
https://github.com/racewalkWill/PhotoBrowseModified
@IBAction func showDepthBtn(_ sender: UIBarButtonItem) {
    // show the depth data from portrait mode from
    // an iPhone 7 Plus and later camera
    requestDepthMap(selectedAsset: asset)
}

func requestDepthMap(selectedAsset: PHAsset) {
    // may not have depthData in many cases
    // the PH completionHandler may be invoked multiple times
    var auxImage: CIImage?
    let options = PHContentEditingInputRequestOptions()
    selectedAsset.requestContentEditingInput(with: options, completionHandler: { input, info in
        guard let input = input else {
            NSLog("contentEditingInput not loaded")
            return
        }
        auxImage = CIImage(contentsOf: input.fullSizeImageURL!, options: [CIImageOption.auxiliaryDepth: true])
        if auxImage != nil {
            let uiImage = UIImage(ciImage: auxImage!)
            self.imageView.image = uiImage
        }
    })
}
The key point is that the PHAsset requestContentEditingInput completion handler is used to get the auxiliaryDepth (or disparity) CIImage.
The session demo discussed the scaling and normalization but did not publish any sample code or app. See my modified PhotoBrowse app for normalizing the CVPixelBuffer depth data.
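For reference, a minimal sketch of the kind of vDSP normalization that project performs (a hypothetical helper; it assumes a DisparityFloat32 pixel buffer with no row padding):
import Accelerate
import CoreVideo

// Normalize a 32-bit float disparity buffer in place to the 0..1 range.
func normalizeInPlace(_ buffer: CVPixelBuffer) {
    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    let count = CVPixelBufferGetWidth(buffer) * CVPixelBufferGetHeight(buffer)
    let values = CVPixelBufferGetBaseAddress(buffer)!.assumingMemoryBound(to: Float32.self)

    var minValue: Float = 0, maxValue: Float = 0
    vDSP_minv(values, 1, &minValue, vDSP_Length(count))
    vDSP_maxv(values, 1, &maxValue, vDSP_Length(count))

    // value = (value - min) / (max - min)
    var negMin = -minValue
    var range = maxValue - minValue
    vDSP_vsadd(values, 1, &negMin, values, 1, vDSP_Length(count))
    vDSP_vsdiv(values, 1, &range, values, 1, vDSP_Length(count))
}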
Also, be sure to use the iPhone's portrait mode to take a picture with depth; almost all the images in my personal photo library do not have depth data.
It's been a while since your question, so you may have worked this out already.

Memory issue while fetching images from Photos library with metadata

I'm trying to get all photos from the photos library, along with each image's metadata. It works fine for 10-20 images, but when there are 50+ images it occupies too much memory, which crashes the app.
Why do I need all the images in an array?
Answer: to send the images to a server app. I'm using GCDAsyncSocket to send data to the receiver socket/port, and I don't have enough waiting time to request images from PHAsset while sending them over the socket/port.
My code:
+ (void)getPhotosDataFromCamera:(void (^)(NSMutableArray *arrImageData))completionHandler
{
    [PhotosManager checkPhotosPermission:^(bool granted)
    {
        if (granted)
        {
            NSMutableArray *arrImageData = [NSMutableArray new];
            NSArray *arrImages = [[NSArray alloc] init];
            PHFetchResult *result = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
            NSLog(@"%d", (int)result.count);
            arrImages = [result copy];
            //--- If no images.
            if (arrImages.count <= 0)
            {
                completionHandler(nil);
                return;
            }
            __block int index = 1;
            __block BOOL isDone = false;
            for (PHAsset *asset in arrImages)
            {
                [PhotosManager requestMetadata:asset withCompletionBlock:^(UIImage *image, NSDictionary *metadata)
                {
                    @autoreleasepool
                    {
                        NSData *imageData = metadata ? [PhotosManager addExif:image metaData:metadata] : UIImageJPEGRepresentation(image, 1.0f);
                        if (imageData != nil)
                        {
                            [arrImageData addObject:imageData];
                            NSLog(@"Adding images :%i", index);
                            //--- Done adding all images.
                            if (index == arrImages.count)
                            {
                                isDone = true;
                                NSLog(@"Done adding all images with info!!");
                                completionHandler(arrImageData);
                            }
                            index++;
                        }
                    }
                }];
            }
        }
        else
        {
            completionHandler(nil);
        }
    }];
}
typedef void (^PHAssetMetadataBlock)(UIImage *image, NSDictionary *metadata);

+ (void)requestMetadata:(PHAsset *)asset withCompletionBlock:(PHAssetMetadataBlock)completionBlock
{
    PHContentEditingInputRequestOptions *editOptions = [[PHContentEditingInputRequestOptions alloc] init];
    editOptions.networkAccessAllowed = YES;
    [asset requestContentEditingInputWithOptions:editOptions completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info)
    {
        CIImage *ciImage = [CIImage imageWithContentsOfURL:contentEditingInput.fullSizeImageURL];
        UIImage *image = contentEditingInput.displaySizeImage;
        dispatch_async(dispatch_get_main_queue(), ^{
            completionBlock(image, ciImage.properties);
        });
    }];
}
+ (NSData *)addExif:(UIImage *)toImage metaData:(NSDictionary *)container
{
    NSData *imageData = UIImageJPEGRepresentation(toImage, 1.0f);
    // create an image source ref
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    // this is the type of image (e.g., public.jpeg)
    CFStringRef UTI = CGImageSourceGetType(source);
    // create a new data object and write the new image into it
    NSMutableData *dest_data = [[NSMutableData alloc] initWithLength:imageData.length + 2000];
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, UTI, 1, NULL);
    if (!destination) {
        NSLog(@"Error: Could not create image destination");
    }
    // add the image contained in the image source to the destination, overriding the old metadata with our modified metadata
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)container);
    BOOL success = CGImageDestinationFinalize(destination);
    if (!success) {
        NSLog(@"Error: Could not create data from image destination");
    }
    CFRelease(destination);
    CFRelease(source);
    return dest_data;
}
Well, it's not a surprise that you end up in this situation, since each of your images consumes memory and you instantiate and keep all of them in memory. This is not really a correct design approach.
In the end it depends on what you want to do with those images.
What I would suggest is that you keep just the array of your PHAsset objects and request the image only on demand.
For example, if you want to present those images in a tableView/collectionView, perform the call to
[PhotosManager requestMetadata:asset withCompletionBlock:^(UIImage *image, NSDictionary *metadata)
directly in the particular method. This way you won't drain the device's memory.
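A minimal Swift sketch of such a per-row request (assuming a hypothetical assets: PHFetchResult<PHAsset> backing the table view):
import Photos
import UIKit

func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
    let cell = tableView.dequeueReusableCell(withIdentifier: "Cell", for: indexPath)
    let asset = assets.object(at: indexPath.row)
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: CGSize(width: 120, height: 120),
                                          contentMode: .aspectFill,
                                          options: options) { image, _ in
        // Only this row's thumbnail is held in memory; a production version
        // should also guard against the cell having been reused in the meantime.
        cell.imageView?.image = image
    }
    return cell
}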
There simply is not enough memory on the phone to load all of the images in the photo library into memory at the same time.
If you want to display the images, then only fetch the images that you need for immediate display. For the rest, keep just the PHAsset. Make sure to discard the images when you don't need them any more.
If you need thumbnails, then fetch only the thumbnails that you need.
If you want to do something with all of the images - like add a watermark to them or process them in some way - then process each image one at a time in a queue, as sketched below.
I cannot advise further, as your question doesn't state why you need all of the images.
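A minimal sketch of that one-at-a-time approach, assuming synchronous delivery so each iteration finishes before the next image is requested (handler is a hypothetical callback, e.g. the code that streams each image over the socket):
import Photos

func processAllImages(in assets: PHFetchResult<PHAsset>,
                      handler: @escaping (Data) -> Void) {
    DispatchQueue.global(qos: .utility).async {
        let options = PHImageRequestOptions()
        options.isSynchronous = true          // deliver inside this loop iteration
        options.isNetworkAccessAllowed = true
        assets.enumerateObjects { asset, _, _ in
            autoreleasepool {                 // release each image before the next one
                PHImageManager.default().requestImageData(for: asset, options: options) { data, _, _, _ in
                    if let data = data { handler(data) }
                }
            }
        }
    }
}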

How to properly detect a PHAsset's file type (GIF)

I have no idea why this is so difficult. I'm trying to determine the file type of a PHAsset; specifically, I want to know if a given asset represents a GIF image or not.
Simply inspecting the asset's filename tells me it's an MP4:
[asset valueForKey:@"filename"] ==> "IMG_XXXX.MP4"
Does iOS convert GIFs to videos when they are saved to the device's image library? I've also tried fetching the image's data and looking at its dataUTI, but it just returns nil for GIFs (and, I'm assuming, all videos as well). I'm fetching the image data as follows:
PHImageManager *manager = asset.imageManager ? asset.imageManager : [PHImageManager defaultManager];
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    PHImageRequestOptions *o = [[PHImageRequestOptions alloc] init];
    o.networkAccessAllowed = YES;
    [manager requestImageDataForAsset:asset.asset options:o resultHandler:^(NSData * _Nullable imageData, NSString * _Nullable dataUTI, UIImageOrientation orientation, NSDictionary * _Nullable info) {
        dispatch_async(dispatch_get_main_queue(), ^{
            CIImage *ciImage = [CIImage imageWithData:imageData];
            if (completion) completion(imageData, dataUTI, orientation, info, ciImage.properties);
        });
    }];
});
The dataUTI returned from the above call is nil.
If anyone knows of a reliable way to determine a PHAsset's file type (I'm specifically looking for GIFs, but being able to determine it for any type of file would be great), let me know!
Use PHAssetResource.
__block BOOL isGIFImage = NO;
NSArray *resourceList = [PHAssetResource assetResourcesForAsset:asset];
[resourceList enumerateObjectsUsingBlock:^(id _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {
    PHAssetResource *resource = obj;
    if ([resource.uniformTypeIdentifier isEqualToString:@"com.compuserve.gif"]) {
        isGIFImage = YES;
    }
}];
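The same check as a Swift sketch (assuming asset is the PHAsset in question):
import Photos
import MobileCoreServices

let isGIF = PHAssetResource.assetResources(for: asset).contains {
    $0.uniformTypeIdentifier == (kUTTypeGIF as String)
}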
You can also find the uniformTypeIdentifier via the PHContentEditingInput class. For this, use the requestContentEditingInput function of PHAsset.
Don't forget to import MobileCoreServices for kUTTypeGIF.
Sample Swift 3.1 code:
let options = PHContentEditingInputRequestOptions()
options.isNetworkAccessAllowed = true // for iCloud backup assets

let asset: PHAsset = ..... // sampleAsset
asset.requestContentEditingInput(with: options) { (contentEditingInput, info) in
    if let uniformTypeIdentifier = contentEditingInput?.uniformTypeIdentifier {
        if uniformTypeIdentifier == (kUTTypeGIF as String) {
            debugPrint("This asset is a GIF👍")
        }
    }
}
For Swift 3.0 and above
import MobileCoreServices

var isGIFImage = false
if let identifier = asset.value(forKey: "uniformTypeIdentifier") as? String {
    if identifier == kUTTypeGIF as String {
        isGIFImage = true
    }
}
I guess since iOS 11, we can use:
if asset.playbackStyle == .imageAnimated {
    // try to show gif animation
}
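Since playbackStyle also distinguishes the other asset kinds, a sketch of an exhaustive switch (iOS 11+) would be:
switch asset.playbackStyle {
case .imageAnimated: print("animated image, e.g. a GIF")
case .image:         print("still photo")
case .livePhoto:     print("Live Photo")
case .video:         print("regular video")
case .videoLooping:  print("looping video")
case .audio:         print("audio")
case .unsupported:   print("unsupported")
@unknown default:    break
}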
First of all, I am not sure what you mean by a GIF image.
Are you referring to a Live Photo or a Time-lapse?
However, if you want to check whether the current asset is a Live Photo or a Time-lapse, you can check like this:
if (asset.mediaSubtypes & PHAssetMediaSubtypePhotoLive)
{
    // this is a Live Photo
}
if (asset.mediaSubtypes & PHAssetMediaSubtypeVideoTimelapse)
{
    // this is a Time-lapse
}
For determining the generic file type of a PHAsset, you can check:
asset.mediaType == PHAssetMediaTypeImage
asset.mediaType == PHAssetMediaTypeVideo
asset.mediaType == PHAssetMediaTypeAudio
// phAsset is an object of PHAsset
if let imageType = phAsset.value(forKey: "uniformTypeIdentifier") as? String {
    if imageType == kUTTypeGIF as String {
        // enter code here
    }
}

How can I determine file size on disk of a video PHAsset in iOS8

I can request a video PHAsset using the Photos framework in iOS8. I'd like to know how big the file is on disk. There doesn't seem to be a property of PHAsset to determine that. Does anyone have a solution? (Using Photos framework not required)
Edit
As of iOS 9.3, using requestImageDataForAsset on a video-type PHAsset will result in an image, which is the first frame of the video, so it doesn't work anymore. Use the following method instead; for normal video the request options can be nil, but for slow-motion video, PHVideoRequestOptionsVersionOriginal needs to be set.
PHVideoRequestOptions *options = [[PHVideoRequestOptions alloc] init];
options.version = PHVideoRequestOptionsVersionOriginal;
[[PHImageManager defaultManager] requestAVAssetForVideo:asset options:options resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
    if ([asset isKindOfClass:[AVURLAsset class]]) {
        AVURLAsset *urlAsset = (AVURLAsset *)asset;
        NSNumber *size;
        [urlAsset.URL getResourceValue:&size forKey:NSURLFileSizeKey error:nil];
        NSLog(@"size is %f", [size floatValue] / (1024.0 * 1024.0)); // size is 43.703005
    }
}];
//original answer
For PHAsset, use this:
[[PHImageManager defaultManager] requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    float imageSize = imageData.length;
    // convert to megabytes
    imageSize = imageSize / (1024 * 1024);
    NSLog(@"%f", imageSize);
}];
For ALAsset:
ALAssetRepresentation *rep = [asset defaultRepresentation];
float imageSize = rep.size/(1024.0*1024.0);
I tested on one video asset: PHAsset shows the size as 43.703125, while ALAsset shows it as 43.703005.
Edit
For PHAsset, there is another way to get the file size. But as @Alfie Hanssen mentioned, it only works on normal video; for slow-motion video the following method returns an AVComposition asset in the block, so I added a check for its type. For slow-motion video, use the requestImageDataForAsset method.
[[PHImageManager defaultManager] requestAVAssetForVideo:asset options:nil resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
    if ([asset isKindOfClass:[AVURLAsset class]]) {
        AVURLAsset *urlAsset = (AVURLAsset *)asset;
        NSNumber *size;
        [urlAsset.URL getResourceValue:&size forKey:NSURLFileSizeKey error:nil];
        NSLog(@"size is %f", [size floatValue] / (1024.0 * 1024.0)); // size is 43.703005
        NSData *data = [NSData dataWithContentsOfURL:urlAsset.URL];
        NSLog(@"length %f", [data length] / (1024.0 * 1024.0)); // data size is 43.703005
    }
}];
Swift version with file size formatting:
let options = PHVideoRequestOptions()
options.version = .original
PHImageManager.default().requestAVAsset(forVideo: asset, options: options) { avAsset, _, _ in
    if let urlAsset = avAsset as? AVURLAsset { // could be an AVComposition for slow-mo
        if let resourceValues = try? urlAsset.url.resourceValues(forKeys: [.fileSizeKey]),
           let fileSize = resourceValues.fileSize {
            let formatter = ByteCountFormatter()
            formatter.countStyle = .file
            let string = formatter.string(fromByteCount: Int64(fileSize))
            print(string)
        }
    }
}
There is a pretty high chance that the video whose size you want to know is not of type AVURLAsset. It is fine that under the hood there are several files your video is composited from (for example raw samples, slow-mo time ranges, filters, etc.), because you want the size of one concrete playable file. I'm not sure how well the estimated file size matches reality in this case, but this is how it can be done:
PHImageManager.defaultManager().requestExportSessionForVideo(asset, options: nil, exportPreset: AVAssetExportPresetHighestQuality, resultHandler: { (assetExportSession, info) -> Void in
    // The options and preset select which version of the video you get
    // (original, after edit, slow-mo, ...), and that affects the resulting size.
    // The time range defaults to zero-to-infinite, so it must be set before the
    // file size computation; the time scale is only a preferred value and
    // should not matter here.
    assetExportSession.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(asset.duration, 30))
    let HERE_YOU_ARE = assetExportSession.estimatedOutputFileLength
})
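The same idea as a sketch in current Swift syntax (assuming asset is a video PHAsset):
import Photos
import AVFoundation

PHImageManager.default().requestExportSession(forVideo: asset,
                                              options: nil,
                                              exportPreset: AVAssetExportPresetHighestQuality) { session, _ in
    guard let session = session else { return }
    // Set the range first; the estimate over the infinite default range is useless.
    session.timeRange = CMTimeRange(start: .zero,
                                    duration: CMTime(seconds: asset.duration, preferredTimescale: 30))
    print("estimated bytes: \(session.estimatedOutputFileLength)")
}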

GIF animates too fast on computer - iOS

I am trying to create GIF animated images, for which I pass an array of images.
Let's say I have a 4-second video and I extract about 120 frames from it. Regardless of the created GIF's size, I create a GIF from all those 120 frames. The problem is, when I open the GIF on the iPhone (by attaching it to MailViewComposer or iMessage) it runs fine, but if I email it or import it to a computer, it runs too fast. Can anyone suggest what is wrong here?
I am using HJImagesToGIF for GIF creation. The dictionary of GIF properties is as below:
NSDictionary *prep = [NSDictionary dictionaryWithObject:[NSDictionary dictionaryWithObject:[NSNumber numberWithFloat:0.03f]
                                                                                    forKey:(NSString *)kCGImagePropertyGIFDelayTime]
                                                 forKey:(NSString *)kCGImagePropertyGIFDictionary];
NSDictionary *fileProperties = @{
    (__bridge id)kCGImagePropertyGIFDictionary: @{
        (__bridge id)kCGImagePropertyGIFLoopCount: @0, // 0 means loop forever
    }
};
Creation of the GIF:
CFURLRef url = (__bridge CFURLRef)[NSURL fileURLWithPath:path];
CGImageDestinationRef dst = CGImageDestinationCreateWithURL(url, kUTTypeGIF, [images count], nil);
CGImageDestinationSetProperties(dst, (__bridge CFDictionaryRef)fileProperties);
for (int i = 0; i < [images count]; i++)
{
    // load anImage from the array
    UIImage *anImage = [images objectAtIndex:i];
    CGImageDestinationAddImage(dst, anImage.CGImage, (__bridge CFDictionaryRef)prep);
}
bool fileSave = CGImageDestinationFinalize(dst);
CFRelease(dst);
if (fileSave) {
    NSLog(@"animated GIF file created at %@", path);
} else {
    NSLog(@"error: no animated GIF file created at %@", path);
}
To save the GIF, I'm using:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
NSData *data = [NSData dataWithContentsOfFile:tempPath];
[library writeImageDataToSavedPhotosAlbum:data metadata:nil completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        NSLog(@"Error Saving GIF to Photo Album: %@", error);
    } else {
        // TODO: success handling
        NSLog(@"GIF Saved to %@", assetURL);
    }
}];
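Note that ALAssetsLibrary has been deprecated since iOS 9; a minimal PHPhotoLibrary sketch of the same save (assuming gifData holds the finished GIF file's bytes) would be:
import Photos

PHPhotoLibrary.shared().performChanges({
    // Adding the raw data as a resource preserves the GIF format on import.
    let request = PHAssetCreationRequest.forAsset()
    request.addResource(with: .photo, data: gifData, options: nil)
}, completionHandler: { success, error in
    print(success ? "GIF saved" : "error: \(String(describing: error))")
})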
Thanks everyone.
