I have a larger app that needs to share multiple images.
I implemented this with UIActivityViewController and UIActivityItemProvider so the items are provided asynchronously (I only have to prepare one image at a time instead of filling memory with all of them at once just to share them).
I was "inspired" by Apple's AirDrop example:
AirdropSample download
However, when I use my app to share e.g. 9 images (to the Camera Roll, i.e. "Save 9 Images"), only 4 to 7 of them actually end up in the Camera Roll, with no error message whatsoever.
If I repeat the process, I sometimes get 5 images, sometimes 6; it appears random.
I cannot post my app here, but I modified the sample above so that it also randomly "fails" to deliver all images to the Camera Roll.
If you download the sample and replace four of its files with the following, it shows the problem:
APLAsyncImageViewController.h:
#import <UIKit/UIKit.h>
#import "APLAsyncImageActivityItemProvider.h"
@interface APLAsyncImageViewController : UIViewController
@end
APLAsyncImageViewController.m:
#import "APLAsyncImageViewController.h"
#import "APLProgressAlertViewController.h"
NSString * const kProgressAlertViewControllerIdentifier = @"APLProgressAlertViewController";

@interface APLAsyncImageViewController ()
@property (strong, nonatomic) UIWindow *alertWindow;
@property (strong, nonatomic) APLProgressAlertViewController *alertViewController;
@property (strong, nonatomic) UIPopoverController *activityPopover;
@property (weak, nonatomic) IBOutlet UIButton *shareImageButton;
- (IBAction)openActivitySheet:(id)sender;
@end

@implementation APLAsyncImageViewController
- (IBAction)openActivitySheet:(id)sender
{
    NSMutableArray *itemArray = [[NSMutableArray alloc] init];
    for (int i = 0; i < 9; i++)
    {
        APLAsyncImageActivityItemProvider *aiImageItemProvider = [[APLAsyncImageActivityItemProvider alloc] init];
        [itemArray addObject:aiImageItemProvider];
    }

    // Create an activity view controller with the activity provider items. UIActivityItemProvider
    // (AsyncImageActivityItemProvider's superclass) conforms to the UIActivityItemSource protocol.
    UIActivityViewController *activityViewController = [[UIActivityViewController alloc] initWithActivityItems:itemArray applicationActivities:nil];

    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone) {
        // iPhone: present the activity view controller as is
        [self presentViewController:activityViewController animated:YES completion:nil];
    }
    else
    {
        // iPad: present the view controller inside a popover
        if (![self.activityPopover isPopoverVisible]) {
            self.activityPopover = [[UIPopoverController alloc] initWithContentViewController:activityViewController];
            [self.activityPopover presentPopoverFromRect:[self.shareImageButton frame] inView:self.view permittedArrowDirections:UIPopoverArrowDirectionAny animated:YES];
        }
        else
        {
            // Dismiss if the button is tapped while the popover is visible
            [self.activityPopover dismissPopoverAnimated:YES];
        }
    }
}
@end
APLAsyncImageActivityItemProvider.h:
#import <UIKit/UIKit.h>
@interface APLAsyncImageActivityItemProvider : UIActivityItemProvider
@end
APLAsyncImageActivityItemProvider.m:
#import "APLAsyncImageActivityItemProvider.h"
#import "UIImage+Resize.h"
@implementation APLAsyncImageActivityItemProvider

- (id)activityViewControllerPlaceholderItem:(UIActivityViewController *)activityViewController
{
    return [[UIImage alloc] init];
}

- (id)item
{
    UIImage *image = [UIImage imageNamed:@"Flower.png"];
    //CGSize imageSize;
    //imageSize.height = 1000;
    //imageSize.width = 1000;
    //image = [UIImage imageWithImage:image scaledToFitToSize:imageSize];
    return image;
}

- (UIImage *)activityViewController:(UIActivityViewController *)activityViewController thumbnailImageForActivityType:(NSString *)activityType suggestedSize:(CGSize)size
{
    // The filtered image is the image to display on the other side.
    return [[UIImage alloc] init];
}
@end
If you run the sample like this (choose the menu item "Send Image After Preprocessing" and press the "SHARE" button), it will often fail to deliver all 9 images to the Camera Roll.
IF you uncomment the four lines in "APLAsyncImageActivityItemProvider.m" that merely scale the output image, THEN it works EVERY time.
Can you tell me why? I feel that if I understand this riddle, I can also fix my app.
Thank you,
Nils
The reason for this issue is pretty simple.
There is a limited number of writer threads available to the app, so writing too many images simultaneously may fail with a "Write Busy" error. You can verify this by manually writing the images with UIImageWriteToSavedPhotosAlbum (or the corresponding Assets/Photos framework counterparts).
Resizing the images merely hides the real problem, because writing a smaller image takes less time.
The problem can be solved by limiting the number of writers and/or retrying in case of error.
Also note that -[UIActivityItemProvider item] is a blocking call, but there is no synchronous way to write to the photo library out of the box. This can be handled with dispatch_semaphore_wait, NSCondition, etc.
Or, if you're using ReactiveCocoa, simply by calling waitUntilCompleted.
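For illustration, here is a rough sketch of the semaphore idea (my own code, not from the sample): gate the number of simultaneous writers and make a blocking context such as -item wait until the asynchronous write has finished. ALAssetsLibrary is used only as an example of an asynchronous writer, and the function name is made up.
#import <AssetsLibrary/AssetsLibrary.h>

// Allow at most two images to be written to the photo library at once,
// and block the caller until each individual write has completed.
// Intended to be called from a background context such as -item.
static void SaveImageAndWait(UIImage *image)
{
    static dispatch_semaphore_t writerGate;
    static dispatch_once_t once;
    dispatch_once(&once, ^{ writerGate = dispatch_semaphore_create(2); });

    dispatch_semaphore_wait(writerGate, DISPATCH_TIME_FOREVER);

    dispatch_semaphore_t done = dispatch_semaphore_create(0);
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeImageToSavedPhotosAlbum:image.CGImage
                              orientation:(ALAssetOrientation)image.imageOrientation
                          completionBlock:^(NSURL *assetURL, NSError *error) {
                              // a real implementation would retry here if error != nil
                              dispatch_semaphore_signal(done);
                          }];
    dispatch_semaphore_wait(done, DISPATCH_TIME_FOREVER);

    dispatch_semaphore_signal(writerGate);
}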
I would go so far as to say that the image created by [UIImage imageWithImage:image scaledToFitToSize:imageSize] is being released before it is saved to the Camera Roll, but I don't know what is going on in that method without seeing its code. It is not a standard UIImage method; a category, maybe?
EDIT
Replace the for loop that adds providers to your item array with this code, then uncomment your resize lines and see if that works.
NSMutableArray *itemArray = [[NSMutableArray alloc] init];
for (int i = 0; i < 9; i++)
{
    UIImage *aiImage = [[[APLAsyncImageActivityItemProvider alloc] init] image];
    [itemArray addObject:aiImage];
}
Related
In my project I create views manually, without using a storyboard. I am trying to play a video when an image is tapped, and it works fine. But every time I check, it shows a memory leak when I tap the image. I have searched about this a lot and applied what I found, but to no avail.
In my Appdelegate.h file:
@property (strong, nonatomic) MPMoviePlayerController *theMoviePlayer;
@property (strong, nonatomic) UIImageView *image1;
In .m file:
- (void)startPage {
    .....
    _image1 = [[UIImageView alloc] initWithFrame:CGRectMake((self.window.frame.size.width/2)-25, 40, 50, 50)];
    [_image1 setUserInteractionEnabled:YES];
    _image1.image = [UIImage imageNamed:@"image_2.jpg"];
    _tapImage1 = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(image1Tapped:)];
    [_image1 addGestureRecognizer:_tapImage1];
    .....
}
In the image1Tapped: method:
- (void)image1Tapped:(UITapGestureRecognizer *)sender
{
    .....
    [_image1 removeFromSuperview];
    _theMoviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
    [_theMoviePlayer setControlStyle:MPMovieControlStyleFullscreen];
    [_theMoviePlayer.view setFrame:CGRectMake(0, -55, self.window.frame.size.width, self.window.frame.size.height)];
    [_theMoviePlayer setScalingMode:MPMovieScalingModeAspectFill];
    UIWindow *backgroundWindow = [[UIApplication sharedApplication] keyWindow];
    [backgroundWindow addSubview:_theMoviePlayer.view];
    [_theMoviePlayer.view bringSubviewToFront:backgroundWindow];
    [_theMoviePlayer play];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(movieFinished:)
                                                 name:MPMoviePlayerPlaybackDidFinishNotification object:_theMoviePlayer];
    ...
}
I get a memory leak every time it enters the image1Tapped: method. Any help would be appreciated.
@property (strong, nonatomic) MPMoviePlayerController *theMoviePlayer;
@property (weak, nonatomic) UIImageView *image1;
Another thing is that your theMoviePlayer is never removed; try making its view transparent to see whether an earlier player is still running behind the new one.
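A small sketch of the kind of cleanup meant here (my own illustration, not code from the question): tear the player down when playback finishes.
- (void)movieFinished:(NSNotification *)notification
{
    // stop observing, pull the player's view off screen and let ARC release it
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:MPMoviePlayerPlaybackDidFinishNotification
                                                  object:_theMoviePlayer];
    [_theMoviePlayer.view removeFromSuperview];
    _theMoviePlayer = nil;
}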
I am trying to help you.
In -(void)startPage you allocate the _image1 object, and in -(void)image1Tapped:(UITapGestureRecognizer *)sender you remove it with [_image1 removeFromSuperview];.
That means _image1 is then gone, and the next time -(void)image1Tapped: is called and you access _image1, it is already nil, so you get the memory leak warning.
The solution is one of the following:
1. Show/hide the _image1 object instead of removing it, or
2. allocate _image1 properly each time and remove it again in -(void)image1Tapped:, depending on your requirements.
Try this first and the memory leak warning should disappear.
The compiler checks all of these steps in advance, which is why it reports this as a warning; in some cases, if your logic is wrong, the compiler is pointing out the wrong logic.
If you want to check this, click the blue arrow of the memory warning and it will give an explanation of the assumed path that triggers the warning.
I found the problem: it's with the device, an iPad running iOS 5. It shows no leaks when I check with an iPad 4 running iOS 7.
I am creating a real estate app. I have a screen that displays a listing of all entries, each with a thumbnail and a little text to the side; these are loaded from the server when the app launches. Each entry can have up to 5 photos, which I do not pre-load for obvious reasons. My issue is this: when the user selects an entry, the app downloads the larger photos from the server, which can take a few seconds depending on circumstances. Right now the app just hangs for those few seconds. I don't know of any practical way to use an activity indicator in a list, and a header area seems like wasted space if it's only used to display "Loading…". Does anyone have ideas on how to let the user know that loading is in progress?
Clarification: Once an entry is selected from the list, I push another table view controller that has the photos in its list of selections. I currently load the photos in viewDidLoad using:
NSData *myPhoto = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:myURL]];
You can:
1. Use UIActivityIndicatorView to show a spinning activity indicator in the precise spot where the image will eventually appear.
2. Download the image on a separate queue. While the code below uses GCD, it's generally better to use an NSOperationQueue with a reasonable maxConcurrentOperationCount (such as 4 or 5), because on a slow network GCD can consume all of the available worker threads and hurt the performance of the rest of the app; see the short sketch after this list.
3. When the download is complete, dispatch the UI update back to the main queue (e.g. turn off the activity indicator and set the image).
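A minimal sketch of that NSOperationQueue variant (my own code; myURL, activityIndicator and imageView are assumed to already exist in the caller's scope):
NSOperationQueue *imageQueue = [[NSOperationQueue alloc] init];
imageQueue.maxConcurrentOperationCount = 4;   // cap the number of simultaneous downloads

[imageQueue addOperationWithBlock:^{
    NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:myURL]];   // runs off the main thread
    UIImage *image = data ? [UIImage imageWithData:data] : nil;
    [[NSOperationQueue mainQueue] addOperationWithBlock:^{
        [activityIndicator stopAnimating];    // UI work goes back on the main queue
        imageView.image = image;
    }];
}];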
This is sample code from a gallery app that shows how you might do it. This is probably more complicated than you need and might be hard to repurpose via cut-and-paste, but the loadImage method shows the basic elements of the solution.
@interface MyImage : NSObject
@property (nonatomic, strong) NSString *urlString;
@property (nonatomic, strong) UIImageView *imageView;
@property (nonatomic, strong) UIActivityIndicatorView *activityIndicator;
@property (nonatomic, strong) UIView *view;
@property BOOL loading;
@property BOOL loaded;
@end

@implementation MyImage
// I find that I generally can get away with loading images in main queue using Documents
// cache, too, but if your images are not optimized (e.g. are large), or if you're supporting
// older, slower devices, you might not want to use the Documents cache in the main queue if
// you want a smooth UI. If this is the case, change kUseDocumentsCacheInMainQueue to NO and
// then use the Documents cache only in the background thread.
#define kUseDocumentsCacheInMainQueue NO
- (id)init
{
    self = [super init];
    if (self)
    {
        _view = [[UIView alloc] initWithFrame:CGRectMake(0.0, 0.0, IMAGE_WIDTH, IMAGE_HEIGHT)];
        _imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, IMAGE_WIDTH, IMAGE_HEIGHT)];
        _imageView.contentMode = UIViewContentModeScaleAspectFill;
        _imageView.clipsToBounds = YES;
        [_view addSubview:_imageView];
        _loading = NO;
        _loaded = NO;
    }
    return self;
}
- (void)loadImage:(dispatch_queue_t)queue
{
    if (self.loading)
        return;

    self.loading = YES;

    ThumbnailCache *cache = [ThumbnailCache sharedManager];

    if (self.imageView.image == nil)
    {
        // I've implemented a caching system that stores images in my Documents folder
        // as well as, for optimal performance, a NSCache subclass. Whether you go through
        // this extra work is up to you.
        UIImage *imageFromCache = [cache objectForKey:self.urlString useDocumentsCache:kUseDocumentsCacheInMainQueue];
        if (imageFromCache)
        {
            if (self.activityIndicator)
            {
                [self.activityIndicator stopAnimating];
                self.activityIndicator = nil;
            }
            self.imageView.image = imageFromCache;
            self.loading = NO;
            self.loaded = YES;
            return;
        }

        // assuming we haven't found it in my cache, then let's see if we need to fire
        // up the spinning UIActivityIndicatorView
        if (self.activityIndicator == nil)
        {
            self.activityIndicator = [[UIActivityIndicatorView alloc] initWithActivityIndicatorStyle:UIActivityIndicatorViewStyleGray];
            self.activityIndicator.center = CGPointMake(self.view.frame.size.width / 2.0, self.view.frame.size.height / 2.0);
            [self.view addSubview:self.activityIndicator];
        }
        [self.activityIndicator startAnimating];

        // now, in the background queue, let's retrieve the image
        dispatch_async(queue, ^{
            if (self.loading)
            {
                UIImage *image = nil;

                // only requery the Documents cache here if we didn't do so in the main
                // queue; for small images doing it in the main queue is fine, but apps
                // with larger images might want to do it in this background queue
                if (!kUseDocumentsCacheInMainQueue)
                    image = [cache objectForKey:self.urlString useDocumentsCache:YES];

                // if we haven't gotten the image yet, retrieve it from the remote server
                if (!image)
                {
                    NSData *data = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:self.urlString]];
                    if (data)
                    {
                        image = [UIImage imageWithData:data];
                        // personally, I cache my image to optimize future access ... you might just store it in the Documents folder, or whatever
                        [cache setObject:image forKey:self.urlString data:data];
                    }
                }

                // now update the UI in the main queue
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (self.loading)
                    {
                        [self.activityIndicator stopAnimating];
                        self.activityIndicator = nil;
                        self.imageView.image = image;
                        self.loading = NO;
                        self.loaded = YES;
                    }
                });
            }
        });
    }
}
// In my gallery view controller, I make sure to unload images that have scrolled off
// the screen. And because I've cached the images, I can re-retrieve them fairly quickly.
// This sort of logic is critical if you're dealing with *lots* of images and you want
// to be responsible with your memory.
- (void)unloadImage
{
    // remove from the image view, but not from the cache
    self.imageView.image = nil;
    self.loaded = NO;
    self.loading = NO;
}
@end
By the way, if the image you're downloading goes into a UIImageView in a UITableViewCell, the final update back to the table should check whether the cell is still on screen (to make sure it wasn't dequeued because the cell scrolled off the screen). In that case, the final UI update after a successful download might look something like this:
dispatch_async(dispatch_get_main_queue(), ^{
    // if the cell is still visible, then set the image
    UITableViewCell *cell = [self.tableView cellForRowAtIndexPath:indexPath];
    if (cell)
    {
        cell.imageView.image = image;
    }
});
Note that this is the UITableView method cellForRowAtIndexPath:, which should not be confused with the UITableViewDataSource method tableView:cellForRowAtIndexPath:.
For one of my projects I used this custom UIImageView subclass:
https://github.com/nicklockwood/AsyncImageView
A small tutorial is located here: http://www.markj.net/iphone-asynchronous-table-image/
With just a few lines of code I managed to implement asynchronous image loading, caching, etc. Give it a look.
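If I remember the library's API correctly, usage boils down to something like this (a sketch only; imageView is assumed to be a plain UIImageView already in your view hierarchy, and you should check the README before relying on it):
#import "AsyncImageView.h"

// the category on UIImageView starts the download, caches the result and
// updates the view when the data arrives
imageView.imageURL = [NSURL URLWithString:@"http://example.com/photo.jpg"];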
Good morning.
I'm having trouble with a UIImagePickerController that shows nothing but white. I have a 320x320 UIImageView declared in my main header file:
IBOutlet UIImageView *_cameraView;
Along with:
@property (nonatomic, retain) UIImagePickerController *_cameraPicker;
@property (nonatomic, retain) UIImage *_noCam;
_noCam is an image which is placed in the view if the user has no camera. That works fine.
I'm trying to show (without the standard camera GUI, but that can wait for now) the camera on top of this square view. In my main class I have:
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    NSLog(@"viewDidAppear did appear too");

    // camera view etc.
    if (![UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        // no camera: don't allow picture taking, probably show something
        // saying you don't have a camera on your device
        NSLog(@"no cam available");
        _noCam = [UIImage imageNamed:@"noCam.png"];
        [_cameraView setImage:_noCam];
    } else {
        // take the shot
        _cameraPicker = [[UIImagePickerController alloc] init];
        _cameraPicker.delegate = _cameraView;
        _cameraPicker.sourceType = UIImagePickerControllerSourceTypeCamera;
        NSLog(@"cam is available");
    }
}
I'm guessing I have a problem with setting the delegate. Using self gives me an error of the type:
'from incompatible type 'ViewController *const __strong''
I'm using NSLogs to check that the methods are running; they tell me that viewDidLoad and viewDidAppear were called and that a camera was detected, but then I just get the white view.
Any help would be hugely appreciated.
Thanks.
You need to display the picker somehow. You can either use the default GUI:
[self presentModalViewController:_cameraPicker animated:YES];
Or check out the cameraOverlayView of UIImagePickerController.
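For the custom-GUI route, a rough sketch (my own, not from the original answer) looks something like this: hide the standard controls and put your own view on top of the camera preview.
_cameraPicker = [[UIImagePickerController alloc] init];
_cameraPicker.sourceType = UIImagePickerControllerSourceTypeCamera;
_cameraPicker.showsCameraControls = NO;              // hide the standard camera GUI

UIView *overlay = [[UIView alloc] initWithFrame:_cameraPicker.view.bounds];
// add your own shutter button, frame, etc. to `overlay` here
_cameraPicker.cameraOverlayView = overlay;

[self presentModalViewController:_cameraPicker animated:YES];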
In my app I need to display stores around the user's location. Each store has a name, a tagline, and a logo, and we want to display all of this information in the callout bubble that comes up on the map when a pin is tapped. Considering that I need to load the image remotely, and that waiting three seconds for the callout after touching the pin is not acceptable, what would be the best solution?
The file with the array of around 20 stores is about 10 KB, but if we load the logo for every store straight away it would be roughly 110 KB (estimating 5 KB per image), and I'm not sure that is a good idea.
In one of my projects this case works just fine.
I'm using SDWebImage for the asynchronous remote load of the images.
I did this:
I subclassed MKPinAnnotationView:
.h
@interface TLStoreResultMapAnnotationView : MKPinAnnotationView
@property (assign) BOOL imageSet;
@end
.m
#import "TLStoreResultMapAnnotationView.h"
#import "TLStoreResultMapAnnotation.h"
#import "UIImageView+WebCache.h"
@implementation TLStoreResultMapAnnotationView
@synthesize imageSet = _imageSet;

- (void)layoutSubviews {
    if (self.selected && (!self.imageSet)) {
        TLStoreResultMapAnnotation *annotation = (TLStoreResultMapAnnotation *)self.annotation;
        NSURL *url = [NSURL URLWithString:[annotation.store.imageURL stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding]];

        UIImageView *storeImageView = (UIImageView *)self.leftCalloutAccessoryView;
        storeImageView.frame = CGRectMake(storeImageView.frame.origin.x, storeImageView.frame.origin.y, 30.0, 30.0);
        storeImageView.contentMode = UIViewContentModeScaleAspectFill;
        storeImageView.clipsToBounds = YES;
        [storeImageView setImageWithURL:url
                       placeholderImage:[UIImage imageNamed:@"webloading.png"]
                                options:SDWebImageCacheMemoryOnly];
        self.imageSet = YES;
    }
    [super layoutSubviews];

    UIImageView *storeImageView = (UIImageView *)self.leftCalloutAccessoryView;
    storeImageView.frame = CGRectMake(storeImageView.frame.origin.x, storeImageView.frame.origin.y, 30.0, 30.0);
}
@end
Of course you will need to adapt the code a bit.
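As a rough companion sketch (my own code, not part of the original answer; the class names follow the snippet above), the annotation view still has to be created in the map view delegate and given a UIImageView as its left callout accessory, e.g.:
- (MKAnnotationView *)mapView:(MKMapView *)mapView viewForAnnotation:(id<MKAnnotation>)annotation
{
    if ([annotation isKindOfClass:[MKUserLocation class]])
        return nil;

    static NSString *reuseId = @"storePin";
    TLStoreResultMapAnnotationView *view = (TLStoreResultMapAnnotationView *)
        [mapView dequeueReusableAnnotationViewWithIdentifier:reuseId];
    if (!view) {
        view = [[TLStoreResultMapAnnotationView alloc] initWithAnnotation:annotation reuseIdentifier:reuseId];
        view.canShowCallout = YES;
        // layoutSubviews above expects a UIImageView in the left callout slot
        view.leftCalloutAccessoryView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 30, 30)];
    } else {
        view.annotation = annotation;
        view.imageSet = NO;   // a different store may be reusing this view
    }
    return view;
}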
Here is my code:
Header:
#import <UIKit/UIKit.h>
@interface ViewController : UIViewController {
    UIImageView *imageView;
    NSMutableArray *arrayWithImages;
}
- (IBAction)startAnimation:(id)sender;
- (IBAction)cleanMemory:(id)sender;
@end
Implementation:
#import "ViewController.h"
@implementation ViewController
......
- (IBAction)startAnimation:(id)sender {
    imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 768, 1024)];
    arrayWithImages = [[NSMutableArray alloc] initWithObjects:
                       [UIImage imageNamed:@"pic1"],
                       [UIImage imageNamed:@"pic2"],
                       [UIImage imageNamed:@"pic3"],
                       [UIImage imageNamed:@"pic4"],
                       [UIImage imageNamed:@"pic5"],
                       [UIImage imageNamed:@"pic6"],
                       [UIImage imageNamed:@"pic7"],
                       [UIImage imageNamed:@"pic8"], nil];
    imageView.animationImages = arrayWithImages;
    imageView.animationDuration = 3;
    imageView.animationRepeatCount = 1;
    [self.view addSubview:imageView];
    [imageView startAnimating];
}

- (IBAction)cleanMemory:(id)sender {
    [arrayWithImages removeAllObjects];
    [arrayWithImages release];
    arrayWithImages = nil;

    [imageView removeFromSuperview];
    [imageView release];
    imageView = nil;
}
@end
I have a ViewController whose view has two buttons. The first button triggers the startAnimation action, which creates the UIImageView and the NSMutableArray and starts the animation. The second button triggers the cleanMemory action, which cleans up everything created in startAnimation.
When I profile with the Activity Monitor instrument, my program starts at 4 MB Real Mem; when I press the startAnimation button it jumps to 16 MB Real Mem, and after the animation I press the cleanMemory button, but it stays at 16 MB Real Mem... Why? I want to get my memory back down to the starting value (4 MB Real Mem). Can you explain where the problem is?
[UIImage imageNamed:] caches the images and releases the memory on its own schedule. If you do not want the caching, that is, if you want complete control over the memory, then load the images directly rather than with imageNamed:.
From the Apple docs:
This method looks in the system caches for an image object with the specified name and returns that object if it exists. If a matching image object is not already in the cache, this method loads the image data from the specified file, caches it, and then returns the resulting object.
You can use
+ (UIImage *)imageWithContentsOfFile:(NSString *)path
to load the image directly.
From the Apple docs:
This method does not cache the image object.
If even after using +(UIImage *)imageWithContentsOfFile: you still can't free the memory, try setting imageView.animationImages = nil;
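For example, building the animation array without the system cache might look like this (a small sketch, assuming the frames are bundled PNGs named pic1 through pic8):
NSMutableArray *frames = [NSMutableArray array];
for (int i = 1; i <= 8; i++) {
    NSString *name = [NSString stringWithFormat:@"pic%d", i];
    NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:@"png"];
    UIImage *frame = [UIImage imageWithContentsOfFile:path];   // not cached by the system
    if (frame) [frames addObject:frame];
}
imageView.animationImages = frames;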