Memory leaks when retrieving photos from the Photo Library in iOS

In my app, I want to retrieve photos and videos from the Photo Library and then save them into my app's documents directory.
Here is my code:
- (UIImage *)getImageFromAsset:(ALAsset *)asset type:(NSInteger)nType
{
    ALAssetRepresentation *assetRepresentation = [asset defaultRepresentation];
    CGImageRef imageReference = [assetRepresentation fullResolutionImage];
    CGFloat imageScale = [assetRepresentation scale];
    UIImageOrientation imageOrientation = (UIImageOrientation)[assetRepresentation orientation];
    UIImage *iImage = [[UIImage alloc] initWithCGImage:imageReference scale:imageScale orientation:imageOrientation];
    return iImage;
}
- (UIImage *)getImageAtIndex:(NSInteger)nIndex type:(NSInteger)nType
{
    return [self getImageFromAsset:(ALAsset *)_assetPhotos[nIndex] type:nType];
}
......
for (NSIndexPath *index in _dSelected) {
    DLog(@"the selected index is %@", index);
    image = nil;
    image = [ASSETHELPER getImageAtIndex:index.row type:ASSET_PHOTO_FULL_RESOLUTION];
    NSString *name = [ASSETHELPER getImageNameAtIndex:index.row];
    NSString *filepath = [files stringByAppendingPathComponent:name];
    NSString *aliapath = [alias stringByAppendingPathComponent:name];
    aliapath = [aliapath stringByAppendingString:THUMBNAIL];
    DLog(@"the files is %@ the alias is %@", filepath, aliapath);
    image = nil;
}
If I retrieve just 20 or 30 photos it is fine, but if I retrieve too many (maybe 50), the app terminates due to memory pressure. I set the image to nil after each one, so I thought iOS would reclaim the memory after each pass through the loop. Why does the memory leak happen?

ARC still needs a chance to manage your memory. As long as you are inside your for loop, ARC never gets the chance to release it. You need to wrap the body of your loop in an autorelease pool.
for (NSIndexPath *index in _dSelected) {
    @autoreleasepool {
        DLog(@"the selected index is %@", index);
        image = nil;
        image = [ASSETHELPER getImageAtIndex:index.row type:ASSET_PHOTO_FULL_RESOLUTION];
        NSString *name = [ASSETHELPER getImageNameAtIndex:index.row];
        NSString *filepath = [files stringByAppendingPathComponent:name];
        NSString *aliapath = [alias stringByAppendingPathComponent:name];
        aliapath = [aliapath stringByAppendingString:THUMBNAIL];
        DLog(@"the files is %@ the alias is %@", filepath, aliapath);
        image = nil;
    }
}
This will let your memory get freed as you deal with each image.
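Since the stated goal is to save each photo into the documents directory, the write can also happen inside the pool so the encoded data is drained each iteration. A rough sketch (the use of JPEG and the 0.9 quality are my assumptions, not part of the original code; filepath comes from the loop above):
@autoreleasepool {
    image = [ASSETHELPER getImageAtIndex:index.row type:ASSET_PHOTO_FULL_RESOLUTION];
    NSData *jpegData = UIImageJPEGRepresentation(image, 0.9); // encode inside the pool so the data is released per iteration
    NSError *writeError = nil;
    if (![jpegData writeToFile:filepath options:NSDataWritingAtomic error:&writeError]) {
        NSLog(@"failed to write %@: %@", filepath, writeError);
    }
    image = nil;
}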

Related

Occupied storage greater than system capacity

I want to simulate a shortage of storage, so I copy files until the storage is full, but it never gets there, and the occupied storage ends up greater than the system capacity. How can this happen?
The code is as follows:
+ (void)copeFiles
{
    NSString *srcPath = [self cloudStorageCacheFolderPath];
    NSLog(@"copy: src path:%@ ,size:%f", srcPath, [self sizeOfcloudStorageCacheFolder]);
    int base = 300;
    int i = base;
    while (1) {
        i++;
        if (i > base + 100) {
            break;
        }
        NSInteger freeSizeM = [self freeFileStorageSize];
        if (freeSizeM > 1024) {
            NSString *newFilePath = [NSString stringWithFormat:@"%@-%d", srcPath, i];
            [self copyFolderContentFromSrc:srcPath toDestPath:newFilePath];
            NSLog(@"copy: i:%d, freeSizeM:%li", i, (long)freeSizeM);
        } else {
            break;
        }
    }
}
+ (void)copyFolderContentFromSrc:(NSString *)srcPath toDestPath:(NSString *)dstPath
{
    NSFileManager *fileManager = [NSFileManager defaultManager];
    NSLog(@"copy: src:%@ dst:%@", srcPath, dstPath);
    BOOL isDir;
    if (![fileManager fileExistsAtPath:dstPath isDirectory:&isDir] || !isDir) {
        BOOL success = [fileManager createDirectoryAtPath:dstPath withIntermediateDirectories:YES attributes:nil error:nil];
        NSLog(@"copy: isDir:%d success:%d", isDir, success);
    }
    NSArray *array = [fileManager contentsOfDirectoryAtPath:srcPath error:nil];
    int i = 0;
    for (NSString *fileName in array) {
        i++;
        NSString *srcFullPath = [srcPath stringByAppendingPathComponent:fileName];
        NSString *toFullPath = [dstPath stringByAppendingPathComponent:fileName];
        NSError *error;
        BOOL isSuccess = [fileManager copyItemAtPath:srcFullPath toPath:toFullPath error:&error];
        NSLog(@"copy:%d", i);
        if (!isSuccess) {
            NSLog(@"copy:error:%@", error);
        }
    }
}
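The question does not show freeFileStorageSize; a plausible implementation, assuming it is meant to return the free space in megabytes, would be:
+ (NSInteger)freeFileStorageSize
{
    // Hypothetical sketch: query the file system's free byte count and convert to MB.
    NSError *error = nil;
    NSDictionary *attrs = [[NSFileManager defaultManager] attributesOfFileSystemForPath:NSHomeDirectory() error:&error];
    if (attrs == nil) {
        NSLog(@"could not read file system attributes: %@", error);
        return 0;
    }
    unsigned long long freeBytes = [attrs[NSFileSystemFreeSize] unsignedLongLongValue];
    return (NSInteger)(freeBytes / (1024 * 1024));
}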
(Two screenshots of the Settings storage readout were attached here.)
The images may not be visible, so I will describe the phenomenon: my iPhone is 32 GB in total, but the storage shown in Settings is 93 GB.
Yes, it will also increase if you run it again :)
The reason is that the file system stores only a reference, not a full duplicate of the file's data (on APFS, copies made this way are copy-on-write clones). I did the same thing on my Mac to fill space for the simulator, and I ended up with the same 256 GB system showing 4.3 TB of space.
On a device you can try to fill the space by recording video, and you still will not hit that error, because Apple manages it roughly like this:
1) delete the temporary files stored in every app's directories;
2) then clear the caches.
Apple handles storage space management very well, and there is still some mystery about it.
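A small experiment (a sketch of mine; bigFilePath and copyPath are hypothetical) makes the copy-on-write behaviour visible: copy a large file and compare free space before and after. On an APFS volume the drop is far smaller than the file's logical size.
NSFileManager *fm = [NSFileManager defaultManager];
NSNumber *before = [fm attributesOfFileSystemForPath:NSHomeDirectory() error:nil][NSFileSystemFreeSize];
[fm copyItemAtPath:bigFilePath toPath:copyPath error:nil]; // creates a clone on APFS, not a physical copy
NSNumber *after = [fm attributesOfFileSystemForPath:NSHomeDirectory() error:nil][NSFileSystemFreeSize];
NSLog(@"free before: %@, after: %@", before, after);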

Memory leak when getting an image EXIF dictionary from an asset object using a static method

Following are the method and the method-call tree that cause the memory leak.
//get the exif info of the image asset in the background
@autoreleasepool {
    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    self.dataSource = [NSMutableArray new];
    __weak typeof(self) weakSelf = self;
    dispatch_async(queue, ^{
        for (int i = 0; i < self.selectImageList.count; i++) {
            PFALAssetImageItemData *dataEntity = weakSelf.selectImageList[i];
            //getting the object using an image asset
            ImageExifInfoEntity *imageExifEntity = [ImageExifInfoEntity getAlbumImageFromAsset:dataEntity.imageAsset imageOryder:i];
            LOG(@"%@", imageExifEntity.description);
            [weakSelf.dataSource addObject:imageExifEntity];
        }
        //back on the main thread, update views
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.collectionView reloadData];
            [self hideHud];
        });
    });
}
In this code I want to create an ImageExifInfoEntity on a background thread, using a static method that takes an asset:
[ImageExifInfoEntity getAlbumImageFromAsset:dataEntity.imageAsset imageOryder:i];
That method creates a new ImageExifInfoEntity and fetches the EXIF dictionary with another static method:
+ (ImageExifInfoEntity *)getAlbumImageFromAsset:(ALAsset *)asset category:(NSString *)category imageOryder:(NSInteger)imageOrder {
    ImageExifInfoEntity *albumImage = [ImageExifInfoEntity new];
    ..........
    albumImage.imageSize = [UIImage imageSizeWithAlasset:asset];
    albumImage.exifDic = [ImageExifInfoEntity getExifInfoFromAsset:asset] == nil ? @{} : [ImageExifInfoEntity getExifInfoFromAsset:asset];
    ..........
}
Finally, I get the EXIF dictionary with this method, which is where the memory leak happens:
+ (NSDictionary *)getExifInfoFromAsset:(ALAsset *)asset {
    NSDictionary *_imageProperty;
    __weak ALAsset *tempAsset = asset;
    ALAssetRepresentation *representation = tempAsset.defaultRepresentation;
    uint8_t *buffer = (uint8_t *)malloc(representation.size);
    NSError *error;
    NSUInteger length = [representation getBytes:buffer fromOffset:0 length:representation.size error:&error];
    NSData *data = [NSData dataWithBytes:buffer length:length];
    CGImageSourceRef cImageSource = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
    CFDictionaryRef imageProperties = CGImageSourceCopyPropertiesAtIndex(cImageSource, 0, NULL);
    _imageProperty = (__bridge_transfer NSDictionary *)imageProperties;
    free(buffer);
    NSLog(@"image property: %@", _imageProperty);
    return _imageProperty;
}
(Instruments screenshots were attached here: the call tree, and the final method that causes the leak.)
CGImageSourceCreateWithData() is a creation function:
CGImageSourceRef cImageSource = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
According to the Core Foundation memory rules, you are responsible for releasing its result with CFRelease(). (The __bridge_transfer cast already hands ownership of imageProperties over to ARC, but cImageSource is never bridged to an Objective-C object, so ARC cannot free it for you.)
I think you should call CFRelease(cImageSource); to release the image source. See the documentation: https://developer.apple.com/library/ios/documentation/GraphicsImaging/Reference/CGImageSource/#//apple_ref/c/func/CGImageSourceCreateWithData
"An image source. You are responsible for releasing this object using CFRelease."
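A corrected sketch of the method (same logic as the question's code, with the missing CFRelease added; the defensive nil checks and the dataWithBytesNoCopy: variant are my additions):
+ (NSDictionary *)getExifInfoFromAsset:(ALAsset *)asset {
    ALAssetRepresentation *representation = asset.defaultRepresentation;
    uint8_t *buffer = (uint8_t *)malloc((size_t)representation.size);
    if (buffer == NULL) return nil;
    NSError *error = nil;
    NSUInteger length = [representation getBytes:buffer fromOffset:0 length:(NSUInteger)representation.size error:&error];
    // Hand buffer ownership to NSData so no separate free() is needed.
    NSData *data = [NSData dataWithBytesNoCopy:buffer length:length freeWhenDone:YES];
    CGImageSourceRef cImageSource = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
    if (cImageSource == NULL) return nil;
    NSDictionary *imageProperty = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(cImageSource, 0, NULL);
    CFRelease(cImageSource); // the release the original code was missing
    return imageProperty;
}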

iOS - Get File Size of Image Selected From Photo Reel

I'm building an app that allows users to select photos and videos from their device and upload them to the server. I'm trying to get the file size (in bytes) of each selected item; can anyone help me out?
if ([dict objectForKey:UIImagePickerControllerMediaType] == ALAssetTypePhoto) { // image file
    if ([dict objectForKey:UIImagePickerControllerOriginalImage]) {
        NSURL *urlPath = [dict objectForKey:@"UIImagePickerControllerReferenceURL"];
        item = [BundleItem itemWithPath:urlPath AndDescription:nil];
        item.itemImage = [dict objectForKeyedSubscript:UIImagePickerControllerOriginalImage];
        item.itemType = 1; // image
        item.itemSize = // what do I need here??
        [m_items addObject:item];
    }
} else if ([dict objectForKey:UIImagePickerControllerMediaType] == ALAssetTypeVideo) { // video file
    if ([dict objectForKey:UIImagePickerControllerOriginalImage]) {
        NSURL *urlPath = [dict objectForKey:@"UIImagePickerControllerReferenceURL"];
        item = [BundleItem itemWithPath:urlPath AndDescription:nil];
        item.itemImage = [dict objectForKeyedSubscript:UIImagePickerControllerOriginalImage];
        item.itemType = 2; // video
        item.itemSize = // what do I need here??
        [m_items addObject:item];
    }
}
EDIT
Getting NSCocoaErrorDomain code 256 with videos:
NSURL *urlPath = [dict objectForKey:@"UIImagePickerControllerReferenceURL"];
item = [BundleItem itemWithPath:urlPath AndDescription:nil];
item.itemImage = [dict objectForKeyedSubscript:UIImagePickerControllerOriginalImage];
item.itemType = 2; // video
//Error Container
NSError *attributesError;
NSDictionary *fileAttributes = [[NSFileManager defaultManager] attributesOfItemAtPath:[urlPath path] error:&attributesError];
NSNumber *fileSizeNumber = [fileAttributes objectForKey:NSFileSize];
long fileSize = [fileSizeNumber longValue];
item.itemSize = fileSize;
[m_items addObject:item];
For image data only:
item.itemImage = (UIImage *)[info valueForKey:UIImagePickerControllerOriginalImage];
NSData *imgData = UIImageJPEGRepresentation(item.itemImage, 1); // 1 represents maximum JPEG quality
NSLog(@"Size of image (bytes): %lu", (unsigned long)[imgData length]);
Hope this will help you.
The method below is more general; something like this should take care of finding the file size of a selected image or video returned from the UIImagePickerController:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSURL *videoUrl = (NSURL *)[info objectForKey:UIImagePickerControllerMediaURL];
    //Error Container
    NSError *attributesError;
    NSDictionary *fileAttributes = [[NSFileManager defaultManager] attributesOfItemAtPath:[videoUrl path] error:&attributesError];
    NSNumber *fileSizeNumber = [fileAttributes objectForKey:NSFileSize];
    long long fileSize = [fileSizeNumber longLongValue];
}
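Note that the NSCocoaErrorDomain code 256 in the edit above comes from passing an assets-library:// reference URL to NSFileManager, which only understands file-system paths. A sketch that resolves the reference URL through ALAssetsLibrary instead (assuming the iOS 7-era AssetsLibrary framework the question is already using):
NSURL *referenceURL = [info objectForKey:UIImagePickerControllerReferenceURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:referenceURL resultBlock:^(ALAsset *asset) {
    long long fileSize = [[asset defaultRepresentation] size]; // size in bytes
    NSLog(@"asset size: %lld bytes", fileSize);
} failureBlock:^(NSError *error) {
    NSLog(@"could not resolve asset: %@", error);
}];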

Calling imageWithData:UIImageJPEGRepresentation() multiple times only compresses image the first time

In order to prevent lagging in my app, I'm trying to compress images larger than 1 MB (mostly pics taken with the iPhone's normal camera).
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
NSData *imageSize = UIImageJPEGRepresentation(image, 1);
NSLog(@"original size %u", [imageSize length]);
UIImage *image2 = [UIImage imageWithData:UIImageJPEGRepresentation(image, 0)];
NSData *newImageSize = UIImageJPEGRepresentation(image2, 1);
NSLog(@"new size %u", [newImageSize length]);
UIImage *image3 = [UIImage imageWithData:UIImageJPEGRepresentation(image2, 0)];
NSData *newImageSize2 = UIImageJPEGRepresentation(image3, 1);
NSLog(@"new size %u", [newImageSize2 length]);
picView = [[UIImageView alloc] initWithImage:image3];
However, the NSLog I get outputs something along the lines of
original size 3649058
new size 1835251
new size 1834884
The difference between the 1st and 2nd compression is almost negligible. My goal is to get the image size below 1 MB. Did I overlook something/is there an alternative approach to achieve this?
EDIT: I want to avoid scaling the image's height and width, if possible.
A couple of thoughts:
The UIImageJPEGRepresentation function does not return the "original" image. For example, if you employ a compressionQuality of 1.0, it does not, technically, return the "original" image, but rather it returns a JPEG rendition of the image with compressionQuality at its maximum value. This can actually yield an object that is larger than the original asset (at least if the original image is a JPEG). You're also discarding all of the metadata (information about where the image was taken, the camera settings, etc.) in the process.
If you want the original asset, you should use PHImageManager:
NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
PHFetchResult *result = [PHAsset fetchAssetsWithALAssetURLs:@[url] options:nil];
PHAsset *asset = [result firstObject];
PHImageManager *manager = [PHImageManager defaultManager];
[manager requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    NSString *filename = [(NSURL *)info[@"PHImageFileURLKey"] lastPathComponent];
    // do what you want with the `imageData`
}];
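If you specifically want the unedited original data, you can also pass a PHImageRequestOptions instead of nil in the call above (a small addition of mine, not part of the original answer):
PHImageRequestOptions *options = [PHImageRequestOptions new];
options.version = PHImageRequestOptionsVersionOriginal; // bypass any edits applied to the asset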
In iOS versions prior to 8, you'd have to use assetForURL of the ALAssetsLibrary class:
NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:url resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *representation = [asset defaultRepresentation];
    NSLog(@"size of original asset %llu", [representation size]);
    // I generally would write directly to a `NSOutputStream`, but if you want it in a
    // NSData, it would be something like:
    NSMutableData *data = [NSMutableData data];
    // now loop, reading data into buffer and writing that to our data stream
    NSError *error;
    long long bufferOffset = 0ll;
    NSInteger bufferSize = 10000;
    long long bytesRemaining = [representation size];
    uint8_t buffer[bufferSize];
    NSUInteger bytesRead;
    while (bytesRemaining > 0) {
        bytesRead = [representation getBytes:buffer fromOffset:bufferOffset length:bufferSize error:&error];
        if (bytesRead == 0) {
            NSLog(@"error reading asset representation: %@", error);
            return;
        }
        bytesRemaining -= bytesRead;
        bufferOffset += bytesRead;
        [data appendBytes:buffer length:bytesRead];
    }
    // ok, successfully read original asset;
    // do whatever you want with it here
} failureBlock:^(NSError *error) {
    NSLog(@"error=%@", error);
}];
Please note that this assetForURL runs asynchronously.
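For the streaming approach mentioned in the comment above, a sketch of the same read loop writing to disk instead of into an NSMutableData (destPath is a hypothetical destination of your choosing):
NSOutputStream *stream = [NSOutputStream outputStreamToFileAtPath:destPath append:NO];
[stream open];
// ... inside the read loop, replace the appendBytes: call with:
[stream write:buffer maxLength:bytesRead]; // writes each chunk straight to disk
// ... and after the loop:
[stream close];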
If you want a NSData with compression, you can use UIImageJPEGRepresentation with a compressionQuality less than 1.0. Your code actually does this with a compressionQuality of 0.0, which should offer maximum compression. But you don't save that NSData, but rather use it to create a UIImage and you then get a new UIImageJPEGRepresentation with a compressionQuality of 1.0, thus losing much of the compression you originally achieved.
Consider the following code:
// a UIImage of the original asset (discarding meta data)
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
// this may well be larger than the original asset
NSData *jpgDataHighestCompressionQuality = UIImageJPEGRepresentation(image, 1.0);
[jpgDataHighestCompressionQuality writeToFile:[docsPath stringByAppendingPathComponent:@"imageDataFromJpeg.jpg"] atomically:YES];
NSLog(@"compressionQuality = 1.0; length = %u", [jpgDataHighestCompressionQuality length]);
// this will be smaller, but with some loss of data
NSData *jpgDataLowestCompressionQuality = UIImageJPEGRepresentation(image, 0.0);
NSLog(@"compressionQuality = 0.0; length = %u", [jpgDataLowestCompressionQuality length]);
UIImage *image2 = [UIImage imageWithData:jpgDataLowestCompressionQuality];
// ironically, this will be larger than jpgDataLowestCompressionQuality
NSData *newImageSize = UIImageJPEGRepresentation(image2, 1.0);
NSLog(@"new size %u", [newImageSize length]);
In addition to the JPEG compression quality outlined in the prior point, you could also just resize the image, and you can marry that with the JPEG compressionQuality as well; a sketch follows.
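A minimal resizing sketch (a hypothetical helper of mine, not from the original answer): draw the image into a smaller context, then apply JPEG compression to the result.
- (NSData *)jpegDataForImage:(UIImage *)image scaledBy:(CGFloat)factor quality:(CGFloat)quality
{
    CGSize newSize = CGSizeMake(image.size.width * factor, image.size.height * factor);
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 1.0); // opaque = NO, scale = 1
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return UIImageJPEGRepresentation(scaled, quality);
}
For example, [self jpegDataForImage:image scaledBy:0.5 quality:0.75] cuts the pixel count to a quarter before compressing.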
You cannot compress the image again and again; if you could, anything could be compressed down to nothing.
One way to make your image smaller is to change its size, for example from 640x960 to 320x480, but you will lose quality.
I would first apply UIImageJPEGRepresentation(image, 0.75), and then change the size, perhaps to two-thirds or half of the image's width and height.

How to load images from an NSURL with the imageNamed behaviour (@2x, ~ipad etc. variants)

[UIImage imageNamed:] will do lots of clever things when loading files from the bundle, like caching to prevent multiple UIImage instances of the same image, looking for @2x and ~ipad suffixes, and setting the scale property correctly. I want to be able to do the same thing when loading images from the documents directory (specified with an NSURL). I've looked around but can't find anything in the docs for this; is there something I've missed?
I'm currently implementing this myself (the whole shebang, with caching, etc.) but I hate to duplicate framework code. I hope to get an answer before I'm done but if not I'll post the code.
This is the best thing I've come up with. It's not ideal as it's duplicating behaviour in the framework (perhaps with subtle inconsistencies), but it does the nice things we want from imageNamed:.
+ (UIImage *)imageNamed:(NSString *)name relativeToURL:(NSURL *)rootURL
{
    // Make sure the URL is a file URL
    if (![rootURL isFileURL])
    {
        NSString *reason = [NSString stringWithFormat:@"%@ only supports file URLs at this time.", NSStringFromSelector(_cmd)];
        @throw [NSException exceptionWithName:NSInvalidArgumentException reason:reason userInfo:nil];
    }
    // Check the cache first, using the raw url/name as the key
    NSCache *cache = objc_getAssociatedObject([UIApplication sharedApplication].delegate, @"imageCache");
    // If cache doesn't exist image will be nil - cache is created later only if everything else goes ok
    NSURL *cacheKey = [rootURL URLByAppendingPathComponent:name];
    UIImage *image = [cache objectForKey:cacheKey];
    if (image != nil)
    {
        // Return the cached image
        return image;
    }
    // Various suffixes to try in preference order
    NSString *scaleSuffix[] =
    {
        @"@2x",
        @""
    };
    CGFloat scaleValues[] =
    {
        2.0f,
        1.0f
    };
    NSString *deviceSuffix[] =
    {
        ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPad) ? @"~ipad" : @"~iphone",
        @""
    };
    NSString *formatSuffix[] =
    {
        @"png"
    };
    NSURL *imageURL = nil;
    CGFloat imageScale = 0.0f;
    // Iterate through scale suffixes...
    NSInteger ss, ssStart, ssEnd, ssInc;
    if ([UIScreen mainScreen].scale == 2.0f)
    {
        // ...forwards
        ssStart = 0;
        ssInc = 1;
    }
    else
    {
        // ...backwards
        ssStart = (sizeof(scaleSuffix) / sizeof(NSString *)) - 1;
        ssInc = -1;
    }
    ssEnd = ssStart + (ssInc * (sizeof(scaleSuffix) / sizeof(NSString *)));
    for (ss = ssStart; (imageURL == nil) && (ss != ssEnd); ss += ssInc)
    {
        // Iterate through device suffixes
        NSInteger ds;
        for (ds = 0; (imageURL == nil) && (ds < (sizeof(deviceSuffix) / sizeof(NSString *))); ds++)
        {
            // Iterate through format suffixes
            NSInteger fs;
            for (fs = 0; fs < (sizeof(formatSuffix) / sizeof(NSString *)); fs++)
            {
                // Add all of the suffixes to the URL and test if it exists
                NSString *nameXX = [name stringByAppendingFormat:@"%@%@.%@", scaleSuffix[ss], deviceSuffix[ds], formatSuffix[fs]];
                NSURL *testURL = [rootURL URLByAppendingPathComponent:nameXX];
                NSLog(@"testing if image exists: %@", testURL);
                if ([testURL checkResourceIsReachableAndReturnError:nil])
                {
                    imageURL = testURL;
                    imageScale = scaleValues[ss];
                    break;
                }
            }
        }
    }
    // If a suitable file was found...
    if (imageURL != nil)
    {
        // ...load and cache the image
        image = [[UIImage alloc] initWithData:[NSData dataWithContentsOfURL:imageURL]];
        image = [UIImage imageWithCGImage:image.CGImage scale:imageScale orientation:UIImageOrientationUp];
        NSLog(@"Image loaded, with scale: %f", image.scale);
        if (cache == nil)
        {
            cache = [NSCache new];
            objc_setAssociatedObject([UIApplication sharedApplication].delegate, @"imageCache", cache, OBJC_ASSOCIATION_RETAIN);
        }
        [cache setObject:image forKey:cacheKey];
    }
    return image;
}
Please let me know if you find any problems. As far as I know the semantics are like imageNamed: - at least for the most common cases. Maybe there are a load of different image formats and some other modifiers I don't know about - the code should be fairly easy to modify to support that.
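Usage would then look something like this (assuming the method above is declared in a UIImage category; docsURL is just an example):
NSURL *docsURL = [[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask] lastObject];
UIImage *image = [UIImage imageNamed:@"photo" relativeToURL:docsURL]; // finds photo@2x.png, photo~ipad.png, etc.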
I think this should do the trick; it's a simple test that checks the screen scale.
UIImage *image;
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)] && [[UIScreen mainScreen] scale] == 2) {
    // @2x
    NSURL *imageURL = [NSURL URLWithString:@"http://www.example.com/images/yourImage@2x.png"];
    NSData *imageData = [NSData dataWithContentsOfURL:imageURL];
    image = [UIImage imageWithData:imageData scale:2.0]; // set the scale explicitly; imageWithData: ignores the @2x suffix
} else {
    // @1x
    NSURL *imageURL = [NSURL URLWithString:@"http://www.example.com/images/yourImage.png"];
    NSData *imageData = [NSData dataWithContentsOfURL:imageURL];
    image = [UIImage imageWithData:imageData];
}
UIImageView *yourImageView = [[UIImageView alloc] initWithImage:image];
It has been answered here:
How should retina/normal images be handled when loading from URL?
Hope it helps.
