App crashes with memory pressure every time in iOS?

I am generating an image from a view using the following function:
-(void)preparePhotos
{
    [assetGroup enumerateAssetsUsingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop)
    {
        if (result == nil)
        {
            return;
        }
        NSMutableDictionary *workingDictionary = [[NSMutableDictionary alloc] init];
        UIImage *img = [[UIImage imageWithCGImage:[[result defaultRepresentation] fullResolutionImage]] resizedImageToSize:CGSizeMake(600, 600)];
        [workingDictionary setObject:img forKey:@"UIImagePickerControllerOriginalImage"];
        [appdelegate.arrImageData addObject:workingDictionary];
    }];
}
But as the number of times this function is called increases, the app crashes. How can I optimize this function, or is there an alternative way to get images from the device gallery that will not result in a crash? The function is called like this:
[self performSelectorInBackground:@selector(preparePhotos) withObject:nil];

If you want to fetch all the images from the photo library, you should only store their asset URLs, not the images themselves. Let's say you are storing the assets in an array named photoAssets; then you can call this method by passing just the index:
- (UIImage *)photoAtIndex:(NSUInteger)index
{
    ALAsset *photoAsset = self.photoAssets[index];
    ALAssetRepresentation *assetRepresentation = [photoAsset defaultRepresentation];
    UIImage *fullScreenImage = [UIImage imageWithCGImage:[assetRepresentation fullScreenImage]
                                                   scale:[assetRepresentation scale]
                                             orientation:UIImageOrientationUp];
    return fullScreenImage;
}
For more information or reference, see Apple's PhotoScroller and MyImagePicker sample projects.
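For example, the asker's preparePhotos could be reworked along these lines to store only the assets (a sketch; it assumes self.photoAssets is an NSMutableArray you create before enumerating):
- (void)preparePhotoAssets
{
    [assetGroup enumerateAssetsUsingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop)
    {
        if (result == nil)
        {
            return;
        }
        // Keep the lightweight ALAsset reference; decode an image only when it is needed.
        [self.photoAssets addObject:result];
    }];
}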

Related

Memory issue while fetching images from Photos library with metadata

I'm trying to get all photos from the photo library along with each image's metadata. It works fine for 10-20 images, but when there are 50+ images it occupies too much memory, which causes the app to crash.
Why do I need all images in an array?
Answer: to send the images to a server app. I'm using GCDAsyncSocket to send data to the receiver socket/port, and I don't have enough waiting time to request images from PHAsset while sending them on the socket/port.
My code:
+(void)getPhotosDataFromCamera:(void(^)(NSMutableArray *arrImageData))completionHandler
{
    [PhotosManager checkPhotosPermission:^(bool granted)
    {
        if (granted)
        {
            NSMutableArray *arrImageData = [NSMutableArray new];
            NSArray *arrImages = [[NSArray alloc] init];
            PHFetchResult *result = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
            NSLog(@"%d", (int)result.count);
            arrImages = [result copy];
            //--- If no images.
            if (arrImages.count <= 0)
            {
                completionHandler(nil);
                return;
            }
            __block int index = 1;
            __block BOOL isDone = false;
            for (PHAsset *asset in arrImages)
            {
                [PhotosManager requestMetadata:asset withCompletionBlock:^(UIImage *image, NSDictionary *metadata)
                {
                    @autoreleasepool
                    {
                        NSData *imageData = metadata ? [PhotosManager addExif:image metaData:metadata] : UIImageJPEGRepresentation(image, 1.0f);
                        if (imageData != nil)
                        {
                            [arrImageData addObject:imageData];
                            NSLog(@"Adding images :%i", index);
                            //--- Done adding all images.
                            if (index == arrImages.count)
                            {
                                isDone = true;
                                NSLog(@"Done adding all images with info!!");
                                completionHandler(arrImageData);
                            }
                            index++;
                        }
                    }
                }];
            }
        }
        else
        {
            completionHandler(nil);
        }
    }];
}
typedef void (^PHAssetMetadataBlock)(UIImage *image, NSDictionary *metadata);

+(void)requestMetadata:(PHAsset *)asset withCompletionBlock:(PHAssetMetadataBlock)completionBlock
{
    PHContentEditingInputRequestOptions *editOptions = [[PHContentEditingInputRequestOptions alloc] init];
    editOptions.networkAccessAllowed = YES;
    [asset requestContentEditingInputWithOptions:editOptions completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info)
    {
        CIImage *CGimage = [CIImage imageWithContentsOfURL:contentEditingInput.fullSizeImageURL];
        UIImage *image = contentEditingInput.displaySizeImage;
        dispatch_async(dispatch_get_main_queue(), ^{
            completionBlock(image, CGimage.properties);
        });
        CGimage = nil;
        image = nil;
    }];
    editOptions = nil;
    asset = nil;
}
+ (NSData *)addExif:(UIImage *)toImage metaData:(NSDictionary *)container
{
    NSData *imageData = UIImageJPEGRepresentation(toImage, 1.0f);
    // create an image source ref
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    // this is the type of image (e.g., public.jpeg)
    CFStringRef UTI = CGImageSourceGetType(source);
    // create a new data object and write the new image into it
    NSMutableData *dest_data = [[NSMutableData alloc] initWithLength:imageData.length + 2000];
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, UTI, 1, NULL);
    if (!destination) {
        NSLog(@"Error: Could not create image destination");
    }
    // add the image contained in the image source to the destination, overriding the old metadata with our modified metadata
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)container);
    BOOL success = NO;
    success = CGImageDestinationFinalize(destination);
    if (!success) {
        NSLog(@"Error: Could not create data from image destination");
    }
    CFRelease(destination);
    CFRelease(source);
    imageData = nil;
    source = nil;
    destination = nil;
    return dest_data;
}
Well, it's no surprise that you end up in this situation, since each of your images consumes memory and you instantiate and keep all of them in memory at once. This is not really a correct design approach.
In the end it depends on what you want to do with those images.
What I would suggest is that you keep just the array of your PHAsset objects and request the image only on demand.
For example, if you want to display those images in a tableView/collectionView, perform the call to
[PhotosManager requestMetadata:asset withCompletionBlock:^(UIImage *image, NSDictionary *metadata)
directly in the particular method. This way you won't drain the device memory.
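A minimal sketch of that on-demand approach using PHImageManager in cellForItemAtIndexPath: (PhotoCell, its imageView, and self.assets are hypothetical names, and PHImageManager here stands in for the asker's PhotosManager wrapper):
// Sketch: request a thumbnail-sized image only when the cell is configured.
// self.assets is assumed to be an NSArray of PHAsset objects kept instead of UIImages.
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                  cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    PhotoCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"PhotoCell"
                                                                forIndexPath:indexPath];
    PHAsset *asset = self.assets[indexPath.item];
    PHImageRequestOptions *options = [PHImageRequestOptions new];
    options.deliveryMode = PHImageRequestOptionsDeliveryModeOpportunistic;
    [[PHImageManager defaultManager] requestImageForAsset:asset
                                                targetSize:CGSizeMake(200, 200)
                                               contentMode:PHImageContentModeAspectFill
                                                   options:options
                                             resultHandler:^(UIImage *result, NSDictionary *info) {
        cell.imageView.image = result; // small image; full-size data is never held in an array
    }];
    return cell;
}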
There simply is not enough memory on the phone to load all of the images in the photo library into memory at the same time.
If you want to display the images, then only fetch the images that you need for immediate display. For the rest, keep just the PHAsset. Make sure to discard the images when you don't need them any more.
If you need thumbnails, then fetch only the thumbnails that you need.
If you want to do something with all of the images, like adding a watermark or processing them in some way, then process each image one at a time in a queue, as sketched below.
I cannot advise further as your question doesn't state why you need all of the images.
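A rough sketch of that one-at-a-time processing with PHImageManager (processImage: is a hypothetical placeholder for the per-image work):
// Sketch: process assets serially so only one full-size image is decoded at a time.
- (void)processAllAssets:(NSArray<PHAsset *> *)assets
{
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_UTILITY, 0), ^{
        PHImageRequestOptions *options = [PHImageRequestOptions new];
        options.synchronous = YES;        // deliver the image inside the loop iteration
        options.networkAccessAllowed = YES;
        for (PHAsset *asset in assets) {
            @autoreleasepool {            // release each decoded image before the next iteration
                [[PHImageManager defaultManager] requestImageForAsset:asset
                                                           targetSize:PHImageManagerMaximumSize
                                                          contentMode:PHImageContentModeDefault
                                                              options:options
                                                        resultHandler:^(UIImage *image, NSDictionary *info) {
                    [self processImage:image]; // hypothetical per-image work (watermark, upload, ...)
                }];
            }
        }
    });
}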

How to get image from asset library iOS?

I am an iOS developer. I want to get all images from the library without UIImagePickerController, and take the first 10 images.
Any ideas?
There are lots of examples out there which will guide you on how to get images from ALAssetsLibrary:
https://www.cocoacontrols.com/search?q=image+picker.
Below is an example of getting the latest image from the library:
- (void)latestPhotoWithCompletion:(void (^)(UIImage *photo))completion
{
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    // Enumerate just the photos and videos group by using ALAssetsGroupSavedPhotos.
    [library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
        // Within the group enumeration block, filter to enumerate just photos.
        [group setAssetsFilter:[ALAssetsFilter allPhotos]];
        // For this example, we're only interested in the last item [group numberOfAssets]-1 = last.
        if ([group numberOfAssets] > 0) {
            [group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndex:[group numberOfAssets] - 1]
                                    options:0
                                 usingBlock:^(ALAsset *alAsset, NSUInteger index, BOOL *innerStop) {
                // The end of the enumeration is signaled by asset == nil.
                if (alAsset) {
                    ALAssetRepresentation *representation = [alAsset defaultRepresentation];
                    // Do something interesting with the AV asset.
                    UIImage *img = [UIImage imageWithCGImage:[representation fullScreenImage]];
                    // completion
                    completion(img);
                    // we only need the first (most recent) photo -- stop the enumeration
                    *innerStop = YES;
                }
            }];
        }
    } failureBlock:^(NSError *error) {
        // Typically you should handle an error more gracefully than this.
    }];
}
Usage
__weak __typeof(self) wSelf = self;
[self latestPhotoWithCompletion:^(UIImage *photo) {
    UIImageRenderingMode renderingMode = YES ? UIImageRenderingModeAlwaysOriginal : UIImageRenderingModeAlwaysTemplate;
    [wSelf.switchCameraBut setImage:[photo imageWithRenderingMode:renderingMode] forState:UIControlStateNormal];
}];
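The question also asks for the first 10 images; a sketch of that, built on the same ALAssetsLibrary enumeration (the fetchFirst10PhotosWithCompletion: name and the count of 10 are illustrative):
// Sketch: collect up to 10 full-screen images from the Saved Photos group.
- (void)fetchFirst10PhotosWithCompletion:(void (^)(NSArray *photos))completion
{
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    NSMutableArray *photos = [NSMutableArray array];
    [library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
        [group setAssetsFilter:[ALAssetsFilter allPhotos]];
        [group enumerateAssetsUsingBlock:^(ALAsset *asset, NSUInteger index, BOOL *innerStop) {
            if (asset == nil) {            // a nil asset marks the end of the group
                return;
            }
            UIImage *img = [UIImage imageWithCGImage:[[asset defaultRepresentation] fullScreenImage]];
            if (img) {
                [photos addObject:img];
            }
            if (photos.count >= 10) {      // stop after the first 10 photos
                *innerStop = YES;
                *stop = YES;
            }
        }];
        if (group == nil && completion) {  // a nil group marks the end of all groups
            completion(photos);
        }
    } failureBlock:^(NSError *error) {
        if (completion) {
            completion(nil);
        }
    }];
}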

ALAssetsLibrary, ALAsset and Memory Warning for 3 Images Display On Screen

My app retrieves the last 100 photos with ALAssetsLibrary and ALAsset. I apply a lazy-load technique to avoid loading more than 3 images on screen. This is the flow:
loop through assets with ALAssetsLibrary to get the last 100
store each ALAsset in an NSMutableArray:
[self.pageImages addObject:asset];
load a maximum of 3 images on screen: one image to view now, one for the next page and one for the previous page
scrolling to the next image removes the previous one and adds one new image to the view
BUT I still receive memory warnings, especially if swiping/scrolling is too fast.
What additional suggestions or approaches should I use when dealing with ALAsset? Here is the code:
ALAsset *asset = [self.pageImages objectAtIndex:page];
NSLog(@"asset %@", asset);
ALAssetRepresentation *arep = [asset defaultRepresentation];
// Retrieve the image orientation from the ALAsset
UIImageOrientation orientation = UIImageOrientationUp;
NSNumber *orientationValue = [asset valueForProperty:@"ALAssetPropertyOrientation"];
if (orientationValue != nil) {
    orientation = [orientationValue intValue];
}
CGFloat scale = 1;
UIImage *img = [UIImage imageWithCGImage:[arep fullResolutionImage]
                                   scale:scale orientation:orientation];
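No answer is excerpted for this question, but consistent with the advice elsewhere on this page, one common adjustment is to decode the screen-sized representation instead of the full-resolution one (a sketch, not the asker's final code):
// Sketch: fullScreenImage is decoded at roughly screen size and is already rotated,
// whereas fullResolutionImage can be tens of megapixels and usually triggers the warnings.
ALAsset *asset = [self.pageImages objectAtIndex:page];
ALAssetRepresentation *arep = [asset defaultRepresentation];
UIImage *img = [UIImage imageWithCGImage:[arep fullScreenImage]
                                   scale:[arep scale]
                             orientation:UIImageOrientationUp];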

Image loading with GCD receiving memory warning

I'm developing a photo gallery application using AssetsLibrary to load my device photos. When presenting a random image in another VC I've noticed the following: it takes about 1 or 2 seconds for my full-res image to load into the imageView (much longer than the native Photos app), and I also get "Received memory warning" in the log after loading a few images. If I set my representation to fullScreenImage the warnings stop, but I don't want this. What must I change for smooth performance and high-quality images in the view?
Here's the code; hope you can tell me what the problem is.
This is the VC where I want to present my image on the screen:
- (void)viewDidLoad
{
    [super viewDidLoad];
    NSLog(@"%@", assetsController);
    detailImageView = [[UIImageView alloc] initWithFrame:self.view.bounds];
    [self.view addSubview:detailImageView];
    detailImageView.image = smallImage; // small image is my asset thumbnail and is passed as an argument in my init function
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0), ^{
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
        ALAsset *asset = [assetsController.albumPics objectAtIndex:assetsController.index];
        ALAssetRepresentation *representation = [asset defaultRepresentation];
        bigImage = [[UIImage imageWithCGImage:[representation fullResolutionImage]] retain];
        dispatch_async(dispatch_get_main_queue(), ^{
            detailImageView.image = bigImage;
        });
        [pool release];
    });
}
UPDATE 1
{
    UIImageView *detailImageView = [[UIImageView alloc] initWithFrame:self.view.bounds];
    [self.view addSubview:detailImageView];
    detailImageView.image = smallImage;
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0), ^{
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
        ALAsset *asset = [assetsController.albumPics objectAtIndex:assetsController.index];
        ALAssetRepresentation *representation = [asset defaultRepresentation];
        UIImage *bigImage = [UIImage imageWithCGImage:[representation fullResolutionImage]];
        dispatch_async(dispatch_get_main_queue(), ^{
            detailImageView.image = bigImage;
        });
        [pool release];
    });
}
Is bigImage an instance variable? Is it used anywhere other than here? If it is not used anywhere else, then it should be a local variable, and you shouldn't retain it. If it is an instance variable that you retain, you need to release the previous value before assigning a new value to it (see the sketch below).
The same discussion applies to detailImageView.
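Under manual reference counting, the instance-variable case the answer describes looks roughly like this (a sketch; setBigImage: is a hypothetical setter):
// Sketch: release the previous value before retaining a new one (pre-ARC).
- (void)setBigImage:(UIImage *)newImage
{
    if (bigImage != newImage) {
        [bigImage release];           // let go of the old image
        bigImage = [newImage retain]; // keep the new one
    }
}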

Getting a URL from (to) a "picked" image, iOS

With the code below, I can extract metadata from an image (pre-added to my project) and render the info as text. This is exactly what I want to do. The SYMetadata is created by pointing to an image via a URL, using initWithAbsolutePathURL. I want to do the same thing with a UIImage, or maybe with the image that is being loaded into the UIImage. How do I get the URL of the image that the picker selects? Or how do I create an "asset" from this incoming image?
The documentation describes initWithAsset. I have not figured out how to use it yet, though, or whether this is the right way to go for my purpose. Any help greatly appreciated.
NSURL *imageURL = [[NSBundle mainBundle] URLForResource:@"someImage" withExtension:@"jpg"];
SYMetadata *metadata = [[SYMetadata alloc] initWithAbsolutePathURL:imageURL];
[textView setText:[[metadata allMetadatas] description]];
Note: I tried adding an NSURL like this: imageURL = [info valueForKey:@"UIImagePickerControllerReferenceURL"]; in the "pickerDidFinish" method, but the metadata is null after I pass this URL to the code above.
If you are using UIImagePickerController, the delegate method will give you what you need:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    if ([[info allKeys] containsObject:UIImagePickerControllerReferenceURL]) {
        // you will get this key if your image comes from a library
        [self setMetaDataFromAssetLibrary:info];
    } else if ([[info allKeys] containsObject:UIImagePickerControllerMediaMetadata]) {
        // if the image comes from the camera you get the metadata in its own key
        self.rawMetaData = [self metaDataFromCamera:info];
    }
}
From the Asset Library: bear in mind that it takes time to complete and has an asynchronous completion block, so you might want to add a completion flag (or your own completion block, as sketched after the method below) to ensure you don't access the property before it has been updated.
- (void)setMetaDataFromAssetLibrary:(NSDictionary *)info
{
    NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:assetURL
             resultBlock:^(ALAsset *asset) {
                 self.rawMetaData = asset.defaultRepresentation.metadata;
             }
            failureBlock:^(NSError *error) {
                NSLog(@"error %@", error);
            }];
}
From Camera:
- (NSDictionary *)metaDataFromCamera:(NSDictionary *)info
{
    NSMutableDictionary *imageMetadata = [info objectForKey:UIImagePickerControllerMediaMetadata];
    return imageMetadata;
}
Here is how to get metadata from a UIImage
- (NSDictionary *)metaDataFromImage:(UIImage *)image
{
    NSData *jpegData = [NSData dataWithData:UIImageJPEGRepresentation(image, 1.0)];
    return [self metaDataFromData:jpegData];
}
But take care: a UIImage may already be stripped of much of the original's metadata. You will be better off getting the metadata from the NSData that was used to create the UIImage:
- (NSDictionary *)metaDataFromData:(NSData *)data
{
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
    if (source == NULL) {
        return nil;
    }
    CFDictionaryRef imageMetaData = CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    CFRelease(source); // Copy/Create rule: release the source and transfer the dictionary to ARC
    return (__bridge_transfer NSDictionary *)imageMetaData;
}
If you have an ALAsset (in my sample, _detailItem), you can get its metadata this way:
NSDictionary *myMetadata = [[_detailItem defaultRepresentation] metadata];