App is terminated due to memory pressure while taking multiple pictures - iOS

I have tried many of the solutions offered on similar questions, but nothing helped much.
Here is what I am doing: I have a collection view that displays images captured with the camera. I capture multiple pictures in one session; for each picture, the image's URL is first saved to a database and then the image is displayed in the collection view.
The problem is that when I take 40-50 pictures in one session, the app crashes and Xcode displays a message like "App is terminating due to memory pressure". I am also getting many memory warnings in the logs, which I had been ignoring.
First, here is the code for taking multiple pictures:
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        // Get image URL from library
        NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
        NSURL *urlPath = [info valueForKey:UIImagePickerControllerReferenceURL];
        if (segmentControl.selectedSegmentIndex != 1) {
            [picker dismissViewControllerAnimated:YES completion:nil];
        }
        if (segmentControl.selectedSegmentIndex == 2) {
            [self insertPicToDB:urlPath];
        } else {
            __block NSURL *url;
            if ([mediaType isEqualToString:(NSString *)kUTTypeImage]) {
                UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
                ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
                // Request to save the image to camera roll
                [library writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:^(NSURL *assetURL, NSError *error) {
                    if (error) {
                    } else {
                        url = assetURL;
                        [self insertPicToDB:url];
                    }
                }];
            }
        }
    }
}
After taking each picture, I save the image URL to the database and at the same time try to reload the collection view:
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath{
    static NSString *identifier = @"Cell";
    collectionCell = (CollectionCell *)[_collectionView dequeueReusableCellWithReuseIdentifier:identifier forIndexPath:indexPath];
    collectionCell.imageView = (UIImageView *)[collectionCell viewWithTag:100];
    collectionCell.imageView.image = [UIImage imageNamed:@"placeholder.png"];
    NSString *fileURL = [[recipeImages[indexPath.section] objectAtIndex:indexPath.item] objectForKey:@"FileUrl"];
    collectionCell.imagURL = fileURL;
    if ([fileURL hasPrefix:@"assets-library"]) {
        [self getImageFromURL:[NSURL URLWithString:fileURL] :indexPath];
    } else {
        fileURL = [NSString stringWithFormat:@"%@/uploads/thumbnail/%@", [[HttpClient sharedInstance] getBaseURLString], [[fileURL componentsSeparatedByString:@"\\"] lastObject]];
        [collectionCell.imageView setImageWithURL:[NSURL URLWithString:fileURL] placeholderImage:[UIImage imageNamed:@"placeholder.png"]];
    }
    return collectionCell;
}
So the situation is: I keep taking pictures, and they are saved in the background.
Here is the definition of the getImageFromURL method:
-(void)getImageFromURL:(NSURL *)yourUrl :(NSIndexPath *)indexPath{
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        @autoreleasepool {
            CGImageRef iref = [rep fullScreenImage];
            if (iref) {
                UIImage *image = [UIImage imageWithCGImage:iref];
                dispatch_async(dispatch_get_main_queue(), ^{
                    collectionCell = (CollectionCell *)[_collectionView cellForItemAtIndexPath:indexPath];
                    if (collectionCell) {
                        NSData *imageData = UIImageJPEGRepresentation(image, 0.1);
                        UIImage *compressedImage = [UIImage imageWithData:imageData];
                        collectionCell.imageView.image = compressedImage;
                    } else {
                        collectionCell.imageView.image = nil;
                    }
                    [collectionCell.imageView setNeedsDisplay];
                    [collectionCell setNeedsDisplay];
                });
                iref = nil;
            }
        }
    };
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Can't get image - %@", [myerror localizedDescription]);
    };
    ALAssetsLibrary *assetslibrary = [[ALAssetsLibrary alloc] init];
    [assetslibrary assetForURL:yourUrl
                   resultBlock:resultblock
                  failureBlock:failureblock];
}
I am also compressing the images while fetching them for the collection view, so I don't think the collection view is the reason for the crash. What can the reason be? Is it because I am using ALAssetsLibrary, or something else?
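Would it be less memory-hungry to use the asset's built-in thumbnail instead of decoding the full-screen image and then re-encoding it with UIImageJPEGRepresentation? A rough sketch of what I mean (getThumbnailFromURL: is just a hypothetical variant of my getImageFromURL: above, using the same class names):
// Rough sketch: [myasset thumbnail] is already decoded at a small size,
// so the full-screen bitmap never has to be held in memory.
- (void)getThumbnailFromURL:(NSURL *)assetURL :(NSIndexPath *)indexPath
{
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:assetURL resultBlock:^(ALAsset *myasset) {
        UIImage *thumb = [UIImage imageWithCGImage:[myasset thumbnail]];
        dispatch_async(dispatch_get_main_queue(), ^{
            CollectionCell *cell = (CollectionCell *)[_collectionView cellForItemAtIndexPath:indexPath];
            cell.imageView.image = thumb;   // messaging nil is a no-op if the cell has scrolled away
        });
    } failureBlock:^(NSError *error) {
        NSLog(@"Can't get thumbnail - %@", [error localizedDescription]);
    }];
}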
I was debugging on an iPhone 4S running iOS 7.1.1.
Thanks in advance.

Related

Image Metadata from Action Extension

Usually, when I want to read image metadata on iOS, I use a UIImagePickerController to choose the image, then the Photos framework and imagePickerController:didFinishPickingMediaWithInfo: to get the image info and extract the metadata, like this:
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary<NSString *,id> *)info{
    UIImagePickerControllerSourceType pickerType = picker.sourceType;
    if (pickerType == UIImagePickerControllerSourceTypePhotoLibrary)
    {
        NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
        PHFetchResult *fetchResult = [PHAsset fetchAssetsWithALAssetURLs:@[url] options:nil];
        PHAsset *asset = fetchResult.firstObject;
        [self metaDataFromPhotoLibrary:asset];
        [self dismissViewControllerAnimated:YES completion:NULL];
    }
}
-(void)metaDataFromPhotoLibrary:(PHAsset *)asset{
    // NSLog(@"Start metadata");
    PHContentEditingInputRequestOptions *options = [[PHContentEditingInputRequestOptions alloc] init];
    options.networkAccessAllowed = YES; // download asset metadata from iCloud if needed
    [asset requestContentEditingInputWithOptions:options completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
        CIImage *fullImage = [CIImage imageWithContentsOfURL:contentEditingInput.fullSizeImageURL];
        NSDictionary *metadata = fullImage.properties;
        NSMutableDictionary *imageMetadata = [[NSMutableDictionary alloc] initWithDictionary:metadata];
        NSString *dateString = metadata[@"{TIFF}"][@"DateTime"];
        NSString *latitude = metadata[@"{GPS}"][@"Latitude"];
        NSString *longitude = metadata[@"{GPS}"][@"Longitude"];
        // etc etc etc
    }];
}
But I can't do the same thing from an Action Extension.
In the extension code I use something like this to get the selected image:
- (void)viewDidLoad {
    [super viewDidLoad];
    // Get the item[s] we're handling from the extension context.
    // For example, look for an image and place it into an image view.
    // Replace this with something appropriate for the type[s] your extension supports.
    BOOL imageFound = NO;
    for (NSExtensionItem *item in self.extensionContext.inputItems) {
        for (NSItemProvider *itemProvider in item.attachments) {
            if ([itemProvider hasItemConformingToTypeIdentifier:(NSString *)kUTTypeImage]) {
                // This is an image. We'll load it, then place it in our image view.
                __weak UIImageView *immagine = self.immagine;
                [itemProvider loadItemForTypeIdentifier:(NSString *)kUTTypeImage options:nil completionHandler:^(UIImage *image, NSError *error) {
                    if (image) {
                        [[NSOperationQueue mainQueue] addOperationWithBlock:^{
                            [immagine setImage:image];
                        }];
                    }
                }];
                imageFound = YES;
                break;
            }
        }
        if (imageFound) {
            // We only handle one image, so stop looking for more.
            break;
        }
    }
}
Using UIImagePNGRepresentation or UIImageJPEGRepresentation I lose most of the metadata and can only read a few fields.
How can I get all of the image metadata from the image selected in the Action Extension?
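One idea I am experimenting with (a rough, untested sketch; the item may arrive as NSURL, NSData or UIImage depending on the host app) is to ask the item provider for the file URL and read the full properties dictionary with ImageIO instead of going through UIImage:
// Rough sketch: inside the attachments loop above, load the item and, if it is a
// file URL, read every property (EXIF, TIFF, GPS, ...) with ImageIO.
// Requires linking ImageIO; kUTTypeImage comes from MobileCoreServices.
[itemProvider loadItemForTypeIdentifier:(NSString *)kUTTypeImage options:nil completionHandler:^(id item, NSError *error) {
    if ([item isKindOfClass:[NSURL class]]) {
        CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)item, NULL);
        if (source) {
            NSDictionary *metadata =
                (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
            NSLog(@"Full metadata: %@", metadata);
            CFRelease(source);
        }
    }
}];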
Thank you so much.
Note: I've found some apps on the App Store that read the full metadata dictionary from an extension, so there must be a solution to my problem!
Thank you again
Vanni

Caching with UIImage and downloaded images

I have a class method which fetches images with a completion block. The fetched UIImage is added to an NSCache under a relevant key. This seems to work as expected; however, in the method which fetches the images I am using UIImage's imageWithData: method, which I have discovered does not cache its data (only imageNamed: does).
I am understandably getting memory warnings because of this. How do I make sure the images loaded with imageWithData: are removed from memory when they are no longer needed?
EDIT
Here is the code for the method which downloads the images.
- (void)imageForFootageSize:(FootageSize)footageSize withCompletionHandler:(void (^)(UIImage *image))completionBlock
{
    if (completionBlock) {
        __block UIImage *image;
        // Try getting local image from disk.
        //
        __block NSURL *imageURL = [self localURLForFootageSize:footageSize];
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            image = [UIImage imageWithData:[NSData dataWithContentsOfURL:imageURL]];
            dispatch_async(dispatch_get_main_queue(), ^{
                if (image) {
                    completionBlock(image);
                } else {
                    //
                    // Otherwise try getting remote image.
                    //
                    imageURL = [self remoteURLForFootageSize:footageSize];
                    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
                        NSData *imageData = [NSData dataWithContentsOfURL:imageURL];
                        dispatch_async(dispatch_get_main_queue(), ^{
                            image = [UIImage imageWithData:imageData];
                            if (image) {
                                //
                                // Save remote image to disk
                                //
                                NSURL *photoDirectoryURL = [Footage localURLForDirectory];
                                // Create the folder(s) where the photos are stored.
                                //
                                [[NSFileManager defaultManager] createDirectoryAtPath:[photoDirectoryURL path] withIntermediateDirectories:YES attributes:nil error:nil];
                                // Save photo
                                //
                                NSString *localPath = [[self localURLForFootageSize:footageSize] path];
                                [imageData writeToFile:localPath atomically:YES];
                            }
                            completionBlock(image);
                        });
                    });
                }
            });
        });
    }
}
EDIT 2
Methods which use the above class method to fetch and process the UIImage in the completionHandler.
Method inside UICollectionViewCell subclass.
- (void)setPhoto:(Photo *)photo withImage:(UIImage *)image
{
    [self setBackgroundColor:[UIColor blackColor]];
    [self.imageView setBackgroundColor:[UIColor clearColor]];
    if (photo && !image) {
        [photo imageForFootageSize:[Footage footageSizeThatBestFitsRect:self.bounds]
             withCompletionHandler:^(UIImage *image) {
                 if ([self.delegate respondsToSelector:@selector(galleryPhotoCollectionViewCell:didLoadImage:)]) {
                     [self.delegate galleryPhotoCollectionViewCell:self didLoadImage:image];
                 }
                 image = nil;
             }];
    }
    [self.imageView setImage:image];
    BOOL isPhotoAvailable = (BOOL)(image);
    [self.imageView setHidden:!isPhotoAvailable];
    [self.activityIndicatorView setHidden:isPhotoAvailable];
}
Method in UICollectionView data source delegate
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    DIGalleryPhotoCollectionViewCell *photoCell = [collectionView dequeueReusableCellWithReuseIdentifier:photoCellIdentifier forIndexPath:indexPath];
    [photoCell setDelegate:self];
    Footage *footage = [self footageForIndexPath:indexPath];
    Photo *photo = ([footage isKindOfClass:[Photo class]]) ? (Photo *)footage : nil;
    if (photo) {
        //
        // Photo
        //
        [photoCell setPhoto:photo withImage:[self.galleryCache objectForKey:photo.footageID]];
    }
    return photoCell;
}
Here are the other relevant methods:
- (void)galleryPhotoCollectionViewCell:(DIGalleryPhotoCollectionViewCell *)cell didLoadImage:(UIImage *)image
{
    NSIndexPath *indexPath = [self.galleryCollectionView indexPathForCell:cell];
    Footage *footage = [self footageForIndexPath:indexPath];
    if ([footage isKindOfClass:[Footage class]]) {
        Photo *photo = (Photo *)footage;
        UIImage *cachedImage = [self.galleryCache objectForKey:photo.footageID];
        if (!cachedImage) {
            cachedImage = image;
            [self.galleryCache setObject:image forKey:photo.footageID];
        }
        [cell setPhoto:photo withImage:image];
    }
}
And also my getter method for the NSCache property galleryCache
- (NSCache *)galleryCache
{
    if (!_galleryCache) {
        _galleryCache = [[NSCache alloc] init];
    }
    return _galleryCache;
}
Instead of rolling your own image downloading and caching solution, you might be better off using SDWebImage. Then you don't have to worry about downloading, caching, or anything else. SDWebImage also uses disk caching, so you don't have to worry about freeing memory.
SDWebImageManager *manager = [SDWebImageManager sharedManager];
[manager downloadWithURL:imageURL options:0 progress:^(NSInteger receivedSize, NSInteger expectedSize)
{
    // progression tracking code
} completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType, BOOL finished)
{
    if (image)
    {
        // do something with image
    }
}];
I'm not sure but you also might have a retain cycle:
__weak typeof(self) weakSelf = self;
[photo imageForFootageSize:[Footage footageSizeThatBestFitsRect:self.bounds] withCompletionHandler:^(UIImage *image) {
    if ([weakSelf.delegate respondsToSelector:@selector(galleryPhotoCollectionViewCell:didLoadImage:)])
    {
        [weakSelf.delegate galleryPhotoCollectionViewCell:weakSelf didLoadImage:image];
    }
    image = nil;
}];
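If you do stick with your own NSCache, a minimal sketch of bounding it and emptying it under memory pressure might look like this (the limits and per-image cost are rough assumptions, not measured values):
- (NSCache *)galleryCache
{
    if (!_galleryCache) {
        NSCache *cache = [[NSCache alloc] init];
        cache.countLimit = 100;                    // at most ~100 decoded images
        cache.totalCostLimit = 50 * 1024 * 1024;   // ~50 MB, if costs are passed in bytes
        // Drop everything when the system reports memory pressure.
        [[NSNotificationCenter defaultCenter] addObserverForName:UIApplicationDidReceiveMemoryWarningNotification
                                                          object:nil
                                                           queue:[NSOperationQueue mainQueue]
                                                      usingBlock:^(NSNotification *note) {
            [cache removeAllObjects];
        }];
        _galleryCache = cache;
    }
    return _galleryCache;
}

// When inserting, pass an approximate cost so totalCostLimit means something:
// NSUInteger cost = image.size.width * image.size.height * 4;
// [self.galleryCache setObject:image forKey:photo.footageID cost:cost];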

Is there a way to load a text string first and then the image using AFNetworking?

I'm using AFNetworking to parse JSON into my app (with Rails as my backend). Right now the app feels very slow, so I'm trying to figure out a way to make it smoother. When I first load the app it takes a few seconds to populate (it shows the nav items and a white page, and then a few seconds later my "posts" appear).
Collection View Controller
- (void)viewDidLoad
{
    [super viewDidLoad];
    self.upcomingReleases = [[NSMutableArray alloc] init];
    [self makeReleasesRequests];
    [self.collectionView registerClass:[ReleaseCell class] forCellWithReuseIdentifier:@"ReleaseCell"];
}

-(void)makeReleasesRequests
{
    NSURL *url = [NSURL URLWithString:@"http://www.soleresource.com/upcoming.json"];
    NSURLRequest *request = [NSURLRequest requestWithURL:url];
    AFHTTPRequestOperation *operation = [[AFHTTPRequestOperation alloc] initWithRequest:request];
    operation.responseSerializer = [AFJSONResponseSerializer serializer];
    [operation setCompletionBlockWithSuccess:^(AFHTTPRequestOperation *operation, id responseObject) {
        NSLog(@"@");
        self.upcomingReleases = [responseObject objectForKey:@"upcoming_releases"];
        [self.collectionView reloadData];
    } failure:nil];
    [operation start];
}

-(NSInteger)numberOfSectionsInCollectionView:(UICollectionView *)collectionView
{
    return 1;
}

-(NSInteger)collectionView:(UICollectionView *)collectionView numberOfItemsInSection:(NSInteger)section
{
    return [self.upcomingReleases count];
}
#pragma mark - Show upcoming release shoe
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath {
    static NSString *identifier = @"Cell";
    ReleaseCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:identifier forIndexPath:indexPath];
    NSDictionary *upcomingReleaseDictionary = [self.upcomingReleases objectAtIndex:indexPath.row];
    NSString *thumbURL = nil;
    cell.release_name.text = [NSString stringWithFormat:@"%@ — $%@", [upcomingReleaseDictionary objectForKey:@"release_name"], [upcomingReleaseDictionary objectForKey:@"release_price"]];
    if ([upcomingReleaseDictionary[@"images"] isKindOfClass:[NSArray class]] && [upcomingReleaseDictionary[@"images"] count]) {
        thumbURL = upcomingReleaseDictionary[@"images"][0][@"image_file"][@"image_file"][@"thumb"][@"url"];
        if (thumbURL)
        {
            NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:thumbURL]];
            UIImage *image = [UIImage imageWithData:imageData];
            cell.thumb.image = image;
        }
    }
    else {
        cell.thumb.image = [UIImage imageNamed:@"air-jordan-5-fear.png"];
    }
    return cell;
}
Each of my posts has a text string and an image. Is there a way to load the text so that it appears right away, and then load the image? Or is there another way to speed up the app's loading (maybe loading a certain number of posts first and then loading the rest, the ones the user can't see until they scroll down)?
Thanks.
You should load your images lazily and asynchronously (don't block the main thread) when they come from the server. AFNetworking already has a caching category on UIImageView (check it out for more):
if (thumbURL)
{
    [cell.thumb setImageWithURL:[NSURL URLWithString:thumbURL] placeholderImage:[UIImage imageNamed:@"air-jordan-5-fear.png"]];
}
EDIT:
Make sure to pull the UIKit+AFNetworking folder into your project and #import "UIKit+AFNetworking.h" in your .m file. The complete AFNetworking download can be found here, and the documentation specific to this question here.
Your problem is this:
if (thumbURL)
{
    NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:thumbURL]];
    UIImage *image = [UIImage imageWithData:imageData];
    cell.thumb.image = image;
}
In particular, these two synchronous calls:
NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:thumbURL]];
UIImage *image = [UIImage imageWithData:imageData];
You should never be getting data in cellForItemAtIndexPath:. You should only be displaying what you have already. Your code makes it so no cell is returned until a thumbnail is downloaded. You can measure this using the Time Profiler in Instruments.
I'm assuming thumb is a UIImageView. Try this:
if (thumb) {
    [thumb setImageWithURL:[NSURL URLWithString:@"http://i.imgur.com/r4uwx.jpg"]];
}
This method, also included with AFNetworking, will download the image, and update it in the cell once it's done downloading. Documentation and other similar methods are here.

Saving picture from app

I am trying to make my application open the camera to take and save pictures.
From my application I am launching the camera to take a picture with the following code:
-(IBAction)TakePhoto:(id)sender {
    picker = [[UIImagePickerController alloc] init];
    picker.delegate = self;
    [picker setSourceType:UIImagePickerControllerSourceTypeCamera];
    [self presentViewController:picker animated:YES completion:NULL];
    [picker release];
    // save image??:
    //UIImageWriteToSavedPhotosAlbum(UIImage *image, id completionTarget, SEL completionSelector, void *contextInfo);
}
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    image = [info objectForKey:UIImagePickerControllerOriginalImage];
    [self dismissViewControllerAnimated:YES completion:NULL];
}

-(void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
    [self dismissViewControllerAnimated:YES completion:NULL];
}
My problem is that once the button is pressed, the camera opens and lets me take a picture. Once the picture is taken, the image is shown and I have the option to "Retake" or "Use". My issue is that if I tap "Use", the image is not saved to the camera roll. Is there a way to save the image, and eventually change the "Use" button to say "Save"?
Thank you for your help!
The photo isn't being saved because you never actually added the code to save the image in the didFinishPickingMediaWithInfo: delegate method. All you have to do is add the line you commented out in TakePhoto: to this method and you will be able to save the photo to the camera roll. For example:
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    image = [info objectForKey:UIImagePickerControllerOriginalImage];
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, NULL);
    [self dismissViewControllerAnimated:YES completion:NULL];
}
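If you also want to know when the save has finished (for example to change the UI once the photo is in the camera roll), UIImageWriteToSavedPhotosAlbum accepts a completion target and selector. A minimal sketch; the selector signature is fixed by UIKit:
// Instead of passing nil/nil, pass a target and selector to be notified when saving completes.
UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), NULL);

// UIKit requires exactly this signature for the completion selector.
- (void)image:(UIImage *)savedImage didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    if (error) {
        NSLog(@"Error saving photo: %@", [error localizedDescription]);
    } else {
        NSLog(@"Photo saved to the camera roll");
    }
}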
static NSDateFormatter *dateFormatter;

- (NSString *)generateNameWithExtension:(NSString *)extension
{
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        dateFormatter = [[NSDateFormatter alloc] init];
        dateFormatter.dateFormat = @"yyyyMMddHHmmssSSS";
    });
    NSDate *now = [NSDate date];
    NSString *string = [dateFormatter stringFromDate:now];
    string = [NSString stringWithFormat:@"%@.%@", string, extension];
    return string;
}

- (NSString *)saveImage:(UIImage *)image WithExtension:(NSString *)extension
{
    extension = [extension lowercaseString];
    NSData *data;
    if ([extension isEqualToString:@"png"])
    {
        data = UIImagePNGRepresentation(image);
    } else if ([extension isEqualToString:@"jpg"] || [extension isEqualToString:@"jpeg"])
    {
        data = UIImageJPEGRepresentation(image, 1.0);
    } else {
        NSLog(@"Error save local image, Extension: (%@) is not recognized, use (PNG/JPG)", extension);
        return nil;
    }
    NSString *imageName = [self generateNameWithExtension:extension];
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documents = [paths objectAtIndex:0];
    NSString *finalPath = [documents stringByAppendingPathComponent:imageName];
    [data writeToFile:finalPath options:NSAtomicWrite error:nil];
    return finalPath;
}
This code saves the image in a folder inside your app's sandbox (the Documents directory). You can load it later with [UIImage imageWithContentsOfFile:finalPath].
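For example, a hypothetical usage from the image picker delegate, assuming these two methods live in the same class:
// Hypothetical usage: save the picked image into the Documents folder and reload it later.
NSString *savedPath = [self saveImage:image WithExtension:@"jpg"];
if (savedPath) {
    UIImage *restored = [UIImage imageWithContentsOfFile:savedPath];
    // use `restored`, e.g. assign it to an image view
}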

Optimizing an array of UIImages

So I'm building an app where users take pictures of themselves; the pictures are saved to the camera roll, and I keep references to the asset URLs so I can display them in the app. At first this model seemed to work fine, but as I took more and more pictures I started receiving memory warnings and the app eventually crashed. Is there a better way to approach this?
This is how I load the saved photos when the app launches (it freezes the app for up to 10 seconds, depending on how many photos are being loaded):
- (void)loadPhotosArray
{
    _photos = [[NSMutableArray alloc] init];
    NSData *data = [[NSUserDefaults standardUserDefaults] objectForKey:@"savedImages"];
    if (data)
    {
        NSArray *storedUrls = [[NSArray alloc] initWithArray:[NSKeyedUnarchiver unarchiveObjectWithData:data]];
        // reverse array
        NSArray *urls = [[storedUrls reverseObjectEnumerator] allObjects];
        for (NSURL *assetUrl in urls)
        {
            // Block to handle image handling success
            ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
            {
                ALAssetRepresentation *rep = [myasset defaultRepresentation];
                CGImageRef iref = [rep fullResolutionImage];
                if (iref) {
                    UIImage *tempImage = [UIImage imageWithCGImage:iref];
                    UIImage *image = [[UIImage alloc] initWithCGImage:tempImage.CGImage scale:1.0 orientation:UIImageOrientationRight];
                    // Set image in imageView
                    [_photos addObject:image];
                    [[NSNotificationCenter defaultCenter] postNotificationName:@"PhotosChanged" object:self];
                }
            };
            // Handles failure of getting image
            ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
            {
                NSLog(@"Can't get image - %@", [myerror localizedDescription]);
            };
            // Load image then call appropriate block
            ALAssetsLibrary *assetslibrary = [[ALAssetsLibrary alloc] init];
            [assetslibrary assetForURL:assetUrl
                           resultBlock:resultblock
                          failureBlock:failureblock];
        }
    }
    else
    {
        NSLog(@"Photo storage is empty");
    }
}
And saving photos:
- (void)addImageToPhotos:(UIImage *)image
{
    // Store image at front of array
    NSMutableArray *temp = [[NSMutableArray alloc] initWithObjects:image, nil];
    // load rest of images onto temp array
    for (UIImage *image in _photos)
    {
        [temp addObject:image];
    }
    _photos = nil;
    _photos = [[NSMutableArray alloc] initWithArray:temp];
    // [self.photos addObject: image];
    [[NSNotificationCenter defaultCenter] postNotificationName:@"PhotosChanged" object:self.photos];
    // save to cache
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library saveImage:image toAlbum:@kAlbumeName withCompletionBlock:^(NSError *error) {
        if (error)
        {
            NSLog(@"Error saving");
        }
    }];
}
I think there are two ways to optimize this (a rough sketch follows below):
1. Save the image name / asset URL strings instead of the UIImage objects themselves, and when you need to display an image, load it on demand (paginate) using the saved string.
2. Use multithreading for this long-running task; I recommend GCD for loading the images from the saved strings.
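A minimal sketch of what I mean (it assumes the saved strings are asset URLs kept in a hypothetical _photoURLs array, and it uses the asset's small thumbnail instead of fullResolutionImage):
// Sketch: keep asset URLs (strings) instead of UIImages, and load a small,
// already-decoded thumbnail on demand rather than fullResolutionImage.
// _photoURLs is a hypothetical NSMutableArray of NSURLs restored from NSUserDefaults.
- (void)loadThumbnailAtIndex:(NSUInteger)index completion:(void (^)(UIImage *thumb))completion
{
    static ALAssetsLibrary *library;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        library = [[ALAssetsLibrary alloc] init];  // keep one library alive for all requests
    });
    [library assetForURL:_photoURLs[index] resultBlock:^(ALAsset *asset) {
        // `thumbnail` is decoded at a small size, unlike fullResolutionImage.
        UIImage *thumb = [UIImage imageWithCGImage:[asset thumbnail]];
        dispatch_async(dispatch_get_main_queue(), ^{
            if (completion) completion(thumb);
        });
    } failureBlock:^(NSError *error) {
        NSLog(@"Can't get image - %@", [error localizedDescription]);
    }];
}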
