Delay loading images in UIImageView - ios

I'm trying to loop through some images in a single UIImageView when I tap a button. The image must disappear 0.1 seconds after the button is pressed.
Here's the code:
int tapCount = 0;
UIImage *image0 = [UIImage imageNamed:@"0.jpg"];
UIImage *image1 = [UIImage imageNamed:@"1.jpg"];
UIImage *image2 = [UIImage imageNamed:@"2.jpg"];
imagesArray = [[NSArray alloc] initWithObjects:image0, image1, image2, nil];
-(IBAction)backgroundButton:(id)sender{
self.myImageView.image = [imagesArray objectAtIndex:tapCount%3];
tapCount++;
[self performSelector:@selector(eraseImage) withObject:self afterDelay:0.1];
}
-(void)eraseImage{
self.myImageView.image = nil;
}
The problem is that the images don't appear until I've completed one entire loop (at the 4th tap).
My guess is that I somehow need to initialize the images in the UIImageView first, because there is a delay between the tap and the image appearing, and since the image is erased after 0.1 seconds it never shows at all.
I've tried loading them inside viewDidLoad like this:
for(int i = 0; i<[imagesArray count]; i++){
self.myImageView.image = [imagesArray objectAtIndex:i];
}
But it only works for the last image loaded (image2 in this case).
Should I cycle through different UIImageViews instead of cycling through different UIImages inside a single UIImageView?
Any other hints?

Creating a UIImage doesn't actually load the image data (you need to render it to a context for that to happen). So, if your images are large then you could be hiding them before they are actually rendered to the screen. You won't be able to hold many images in memory at the same time, but you can force the image data to be loaded by creating a context and drawing the image into it (which can be done in the background, using CGContextDrawImage).
There are a few third-party bits of code which do this, like this one, or check this discussion.

Use the animationImages and animationDuration properties of UIImageView:
https://developer.apple.com/library/ios/documentation/uikit/reference/UIImageView_Class/Reference/Reference.html#//apple_ref/occ/instp/UIImageView/animationImages
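A minimal sketch of that approach, reusing the imagesArray and myImageView from the question (the duration and repeat count here are just assumptions):
self.myImageView.animationImages = imagesArray;
self.myImageView.animationDuration = 0.3; // total time for one pass through the three frames (assumed value)
self.myImageView.animationRepeatCount = 1; // play the sequence once
[self.myImageView startAnimating];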

I think there is a much simpler way to achieve the animation you are going for. Try the following code:
-(IBAction)backgroundButton:(id)sender{
[UIView animateWithDuration:0.2
delay:0.0
options:UIViewAnimationOptionCurveEaseIn
animations:^{
self.myImageView.image = [imagesArray objectAtIndex:tapCount%3];
self.myImageView.image = nil;
}
completion:nil
];
tapCount++;
if (tapCount == 3) {
tapCount = 0;
}
}

I finally managed to work around this using the following solution:
First, I preload all the images on a background thread:
-(void)preload:(UIImage *)image{
CGImageRef ref = image.CGImage;
size_t width = CGImageGetWidth(ref);
size_t height = CGImageGetHeight(ref);
CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, width * 4, space, kCGBitmapAlphaInfoMask & kCGImageAlphaPremultipliedFirst);
CGColorSpaceRelease(space);
CGContextDrawImage(context, CGRectMake(0, 0, width, height), ref);
CGContextRelease(context);
}
Then I execute the same action I had in the beginning:
-(IBAction)backgroundButton:(id)sender{
self.myImageView.image = [imagesArray objectAtIndex:tapCount%3];
tapCount++;
[self performSelector:@selector(eraseImage) withObject:self afterDelay:0.1];
}
-(void)eraseImage{
self.myImageView.image = nil;
}
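For completeness, a hedged sketch of how the preload: method above might be driven from a background queue (for example from viewDidLoad), assuming imagesArray has already been populated:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    for (UIImage *image in imagesArray) {
        [self preload:image]; // force-decode each frame off the main thread
    }
});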

Related

NSMutableArray creates issue for images

I have an array of dictionaries. Each dictionary contains UIImage and NSString values.
But when I try to delete an object using
[arr removeObjectAtIndex:button.tag];
it only decreases the count; the image (object) is not deleted.
So my images overlap when I try to delete one.
It also causes a problem when I try to add objects to the array with
[arr insertObject:dict atIndex:0];
so I used
[_postObj.arr_images addObject:dict]; instead of insertObject:.
Help me solve this.
Objects use reference counting and both NSMutableArray and UIImageView will retain the UIImage object.
This means that removing it from the array will not automatically remove it from the UIImageView and you must remove it from both explicitly:
[arr removeObjectAtIndex:button.tag];
_imageView.image = nil;
The problem is in how you set the frames of your UIImageViews. Please try the code below (not tested by myself) and see if it fixes the issue for you. Basically, I am adjusting the X position of each image based on the previous image's width.
CGFloat imageXM = 5.0;
CGFloat imageXPos = 0;
for (UIImage *image in arr_images) {
CGRect imageFrame = CGRectMake(imageXPos + imageXM, 0.0, image.size.width, image.size.height);
imageXPos = imageXPos + image.size.width;
UIImageView *imageView = [[UIImageView alloc] initWithFrame:imageFrame];
imageView.contentMode = UIViewContentModeScaleAspectFit;
imageView.image = image;
[scl addSubview:imageView];
}
As in your code
[picker dismissViewControllerAnimated:YES completion:^{
[arr_images insertObject:chosenImage atIndex:0];
[self scrollImages_Setup2];
}];
[arr_images insertObject:chosenImage atIndex:0]; means you only need to display one image in the scroll view. Then why do you use a loop:
-(void)scrollImages_Setup2
{
for (int k=0; k<arr_images.count; k++) {
UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake((CGRectGetWidth(scl.frame) * k) + CGRectGetWidth(scl.frame), 0, CGRectGetWidth(scl.frame), CGRectGetHeight(scl.frame))];
imageView.contentMode = UIViewContentModeScaleAspectFit;
imageView.image = [arr_images objectAtIndex:k];
[scl addSubview:imageView];
}
scl.contentSize = CGSizeMake((CGRectGetWidth(scl.frame) * arr_images.count)+CGRectGetWidth(scl.frame), CGRectGetHeight(scl.frame));
}
You only wrote code to display the image; you don't need a scroll view to display a single image. I think you are not clear about what you want.
Your images overwrite each other because of the loop, so remove the loop when you display a single image. Also remove the scroll view contentSize line: arr_images keeps growing as you add images, so the content width gets larger and larger; remove the arr_images.count part as well.
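A rough sketch of that suggestion, assuming a hypothetical singleImageView ivar is reused instead of adding a new image view on every call (scl and arr_images as in the question):
-(void)scrollImages_Setup2
{
    // lazily create one reusable image view that fills the scroll view
    if (singleImageView == nil) {
        singleImageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, CGRectGetWidth(scl.frame), CGRectGetHeight(scl.frame))];
        singleImageView.contentMode = UIViewContentModeScaleAspectFit;
        [scl addSubview:singleImageView];
    }
    // show only the newest image, which the picker callback inserted at index 0
    singleImageView.image = [arr_images firstObject];
}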

Can the new tintColor property of UIImageView in iOS 7 be used for animating images?

tintColor is a lifesaver; it takes app theming to a whole new (easy) level.
//the life saving bit is the new UIImageRenderingModeAlwaysTemplate mode of UIImage
UIImage *templateImage = [[UIImage imageNamed:@"myTemplateImage"] imageWithRenderingMode:UIImageRenderingModeAlwaysTemplate];
imageView.image = templateImage;
//set the desired tintColor
imageView.tintColor = color;
The above code will "paint" the image's non-transparent parts according to the UIImageView's tintColor, which is oh so cool. No need for Core Graphics for something as simple as that.
The problem I face is with animations.
Continuing from the above code:
//The array with the names of the images we want to animate
NSArray *imageNames = @[@"1", @"2", @"3", @"4", @"5"];
//The array with the actual images
NSMutableArray *images = [NSMutableArray new];
for (int i = 0; i < imageNames.count; i++)
{
[images addObject:[[UIImage imageNamed:[imageNames objectAtIndex:i]] imageWithRenderingMode:UIImageRenderingModeAlwaysTemplate]];
}
//We set the animation images of the UIImageView to the images in the array
imageView.animationImages = images;
//and start animating the animation
[imageView startAnimating];
The animation is performed correctly but the images use their original color (i.e. the color used in the gfx editing application) instead of the UIImageView's tintColor.
I am about to try to perform the animation myself (by doing something a little bit over the top, like looping through the images and setting the UIImageView's image property with an NSTimer delay so that the human eye can catch it).
Before doing that, I'd like to ask whether the tintColor property of UIImageView is supposed to support what I'm trying to do with it, i.e. use it for animations.
Thanks.
Rather than animate the images myself, I decided to render the individual frames using a tint color and then let UIImage do the animation. I created a category on UIImage with the following methods:
+ (instancetype)animatedImageNamed:(NSString *)name tintColor:(UIColor *)tintColor duration:(NSTimeInterval)duration
{
NSMutableArray *images = [[NSMutableArray alloc] init];
short index = 0;
while ( index <= 1024 )
{
NSString *imageName = [NSString stringWithFormat:@"%@%d", name, index++];
UIImage *image = [UIImage imageNamed:imageName];
if ( image == nil ) break;
[images addObject:[image imageTintedWithColor:tintColor]];
}
return [self animatedImageWithImages:images duration:duration];
}
- (instancetype)imageTintedWithColor:(UIColor *)tintColor
{
CGRect imageBounds = CGRectMake( 0, 0, self.size.width, self.size.height );
UIGraphicsBeginImageContextWithOptions( self.size, NO, self.scale );
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextTranslateCTM( context, 0, self.size.height );
CGContextScaleCTM( context, 1.0, -1.0 );
CGContextClipToMask( context, imageBounds, self.CGImage );
[tintColor setFill];
CGContextFillRect( context, imageBounds );
UIImage *tintedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return tintedImage;
}
It works just like + [UIImage animatedImageNamed:duration:] (including looking for files named "image0", "image1", etc) except that it also takes a tint color.
Thanks to this answer for providing the image tinting code: https://stackoverflow.com/a/19152722/321527
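Hypothetical usage of the category above, assuming frames named "spinner0", "spinner1", ... are in the bundle; an animated UIImage starts playing as soon as it is assigned to the image view:
imageView.image = [UIImage animatedImageNamed:@"spinner"
                                    tintColor:[UIColor redColor]
                                     duration:1.0];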
.tintColor can probably handle it. I use NSTimers for UIButton's setTitleColor method all the time. Here's an example.
UPDATED: Tested and works on iPhone 5s iOS 7.1!
- (void)bringToMain:(UIImage *)imageNam {
timer = [NSTimer scheduledTimerWithTimeInterval:.002
target:self
selector:@selector(animateTint)
userInfo:nil
repeats:YES];
}
- (void)animateTint {
asd += 1.0f;
[imageView setTintColor:[UIColor colorWithRed:(asd/100.0f) green:0.0f blue:0.0f alpha:1.0f]];
if (asd == 100) {
asd = 0.0f;
[timer invalidate];
}
}

Generate and store many images at first launch iOS

I need to generate and save 320 images as PNGs when the game is first run. These images will then be loaded instead of being generated again. Here is the process:
load image template (black and white with alpha)
overlay non transparent pixels with specified colour
put on top the template at 0.3 opacity merging it to one final image
return back UIImage
store the UIImage, converted to NSData to PNG in Cache directory
This is done using UIGraphicsBeginImageContextWithOptions. This process needs to be done for 32 image templates in 10 colours on the background thread. The purpose is that these will be used as avatar/profile images in this game, scaled down at certain screens as appropriate. They cannot be generated every time though, because this causes too much lag.
The images are 400x400 each. They end up being about 20-25 kB each when stored. When I try to use my current way of generating and storing them, I get a memory warning, and I see (using Instruments) that the number of alive CGImage and UIImage objects keeps increasing rapidly. It seems like they're being retained, but I don't hold any references to them.
Here is my other question detailing the code I'm using more closely: UIGraphicsBeginImageContext created image
What is the best way to create and store to secondary storage this many images? Thanks in advance.
Edit:
Here's the whole code I currently use to create and save the images:
//==========================================================
// Definitions and Macros
//==========================================================
//HEX color macro
#define UIColorFromRGB(rgbValue) [UIColor \
colorWithRed:((float)((rgbValue & 0xFF0000) >> 16))/255.0 \
green:((float)((rgbValue & 0xFF00) >> 8))/255.0 \
blue:((float)(rgbValue & 0xFF))/255.0 alpha:1.0]
//Colours
#define RED_COLOUR UIColorFromRGB(0xF65D58)
#define ORANGE_COLOUR UIColorFromRGB(0xFF8D16)
#define YELLOW_COLOUR UIColorFromRGB(0xFFD100)
#define LIGHT_GREEN_COLOUR UIColorFromRGB(0x82DE13)
#define DARK_GREEN_COLOUR UIColorFromRGB(0x67B74F)
#define TURQUOISE_COLOUR UIColorFromRGB(0x32ADA6)
#define LIGHT_BLUE_COLOUR UIColorFromRGB(0x11C9FF)
#define DARK_BLUE_COLOUR UIColorFromRGB(0x2E97F5)
#define PURPLE_COLOUR UIColorFromRGB(0x8F73FD)
#define PINK_COLOUR UIColorFromRGB(0xF35991)
#import "ViewController.h"
@implementation ViewController
- (void)viewDidLoad
{
[super viewDidLoad];
//Generate the graphics
[self generateAndSaveGraphics];
}
//==========================================================
// Generating and Saving Graphics
//==========================================================
-(void)generateAndSaveGraphics {
dispatch_async( dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
[self createAvatarImages];
//Here create all other images that need to be saved to Cache directory
dispatch_async( dispatch_get_main_queue(), ^{ //Finished
NSLog(#"DONE"); //always runs out of memory before getting here
});
});
}
-(void)createAvatarImages {
//Create avatar images
NSArray *colours = [NSArray arrayWithObjects:RED_COLOUR, ORANGE_COLOUR, YELLOW_COLOUR, LIGHT_GREEN_COLOUR, DARK_GREEN_COLOUR, TURQUOISE_COLOUR, LIGHT_BLUE_COLOUR, DARK_BLUE_COLOUR, PURPLE_COLOUR, PINK_COLOUR, nil];
NSString *cacheDir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) lastObject];
for(int i = 0; i < 32; i++) { //Avatar image templates are named m1 - m16 and f1 - f16
NSString *avatarImageName;
if(i < 16) { //female avatars
avatarImageName = [NSString stringWithFormat:#"f%i", i+1];
}
else { //male avatars
avatarImageName = [NSString stringWithFormat:#"m%i", i-15];
}
for(int j = 0; j < colours.count; j++) { //make avatar image for each colour
@autoreleasepool { //only helps very slightly
UIColor *colour = [colours objectAtIndex:j];
UIImage *avatarImage = [self tintedImageFromImage:[UIImage imageNamed:avatarImageName] colour:colour intensity:0.3];
NSString *fileName = [NSString stringWithFormat:@"%@_%i.png", avatarImageName, j];
NSString *filePath = [cacheDir stringByAppendingPathComponent:fileName];
NSData *imageData = [NSData dataWithData:UIImagePNGRepresentation(avatarImage)];
[imageData writeToFile:filePath atomically:YES];
NSLog(#"AVATAR IMAGE CREATED");
}
}
}
}
//==========================================================
// Universal Image Tinting Code
//==========================================================
//Creates a tinted image based on the source greyscale image and tinting intensity
-(UIImage *)tintedImageFromImage:(UIImage *)sourceImage colour:(UIColor *)color intensity:(float)intensity {
if (UIGraphicsBeginImageContextWithOptions != NULL) {
UIGraphicsBeginImageContextWithOptions(sourceImage.size, NO, 0.0);
} else {
UIGraphicsBeginImageContext(sourceImage.size);
}
CGContextRef context = UIGraphicsGetCurrentContext();
CGRect rect = CGRectMake(0, 0, sourceImage.size.width, sourceImage.size.height);
// draw alpha-mask
CGContextSetBlendMode(context, kCGBlendModeNormal);
CGContextDrawImage(context, rect, sourceImage.CGImage);
// draw tint color, preserving alpha values of original image
CGContextSetBlendMode(context, kCGBlendModeSourceIn);
[color setFill];
CGContextFillRect(context, rect);
//Set the original greyscale template as the overlay of the new image
sourceImage = [self verticallyFlipImage:sourceImage];
[sourceImage drawInRect:CGRectMake(0,0, sourceImage.size.width,sourceImage.size.height) blendMode:kCGBlendModeMultiply alpha:intensity];
UIImage *colouredImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
colouredImage = [self verticallyFlipImage:colouredImage];
return colouredImage;
}
//Vertically flips an image
-(UIImage *)verticallyFlipImage:(UIImage *)originalImage {
UIImageView *tempImageView = [[UIImageView alloc] initWithImage:originalImage];
UIGraphicsBeginImageContext(tempImageView.frame.size);
CGContextRef context = UIGraphicsGetCurrentContext();
CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, tempImageView.frame.size.height);
CGContextConcatCTM(context, flipVertical);
[tempImageView.layer renderInContext:context];
UIImage *flippedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return flippedImage;
}
@end
I've created a test project (in the zip) to illustrate the problem:
Project Files
For future reference, the solution is this one line of code:
tempImageView.image = nil;
Thanks to Matic.
It would seem that the issue is in the verticallyFlipImage method. The graphics context seems to retain the temporary image view you create, and with it the image you assign. This would probably be fixed in general by pushing each image through the process as its own dispatch call: resample an image -> callback -> resample the next (or exit).
At the end of the whole resampling pass, all the data is released and there is no memory leak. To make a quick fix you can simply call tempImageView.image = nil; before returning the image. The image view itself still produces some memory growth, but it is too small to have any impact.
This works for me and I hope it helps you.
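For reference, the quick fix sits at the end of the question's verticallyFlipImage:, roughly like this:
UIImage *flippedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
tempImageView.image = nil; // drop the image retained by the temporary image view
return flippedImage;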
EDIT: added the dispatch concept (comment reference)
dispatch_queue_t internalQueue;
- (void)createQueue {
dispatch_sync(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^(void) {
internalQueue = dispatch_queue_create("myQueue", DISPATCH_QUEUE_SERIAL); //we created a high priority queue
});
}
- (void)deleteQueue {
dispatch_release(internalQueue);
}
- (void)imageProcessingDone {
[self deleteQueue];
//all done here
}
- (void)processImagesInArray:(NSMutableArray *)imageArray {
//take out 1 of the objects (last in this case, you can do objectAtIndex:0 if you wish)
UIImage *img = [[imageArray lastObject] retain]; //note, image retained so the next line does not deallocate it (released at NOTE1)
[imageArray removeLastObject]; //remove from the array
dispatch_async(internalQueue, ^(void) { //dispatch
//do all the image processing + saving
[img release];//NOTE1
//callback: in this case I push it to the main thread. There should be little difference if you simply dispatch it again on the internalQueue
if(imageArray.count > 0) {
[self performSelectorOnMainThread:@selector(processImagesInArray:) withObject:imageArray waitUntilDone:NO];
}
else {
[self performSelectorOnMainThread:@selector(imageProcessingDone) withObject:nil waitUntilDone:NO];
}
});
}
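A hedged usage sketch for the above, with illustrative names: build the list of source images, create the queue, then start the chain (written in the same manual-retain style as the answer):
NSMutableArray *workList = [NSMutableArray array];
for (int i = 1; i <= 16; i++) {
    UIImage *templateImage = [UIImage imageNamed:[NSString stringWithFormat:@"f%i", i]];
    if (templateImage != nil) {
        [workList addObject:templateImage]; // the array retains each image
    }
}
[self createQueue];
[self processImagesInArray:workList];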

iOS Simulator resetting an image iteratively

I'm currently developing an app that resets an image iteratively, and the future images cannot be stored or predicted. For simple tests, I used
UIImage *image = [UIImage imageNamed:@"mountain.jpg"];
image = [ /* function that changes image */ ];
self.imageView.image = image;
and this worked fine. The imageView updated, just like I wanted. But when I do
for (i=0; i<10; i++){
image = [ /* function that changes image */ ];
self.imageView.image = image;
}
it doesn't work. So I tried this:
for (int i=0; i<10; i++) {
NSLog(#"%d", i);
[NSThread sleepForTimeInterval:1.0];
image = [self.brain recontructImage:image iterations:10];
self.imageView.image = image;
[self.imageView setNeedsDisplay];
[self.imageView setImage:image];
}
and also flipped it so that [self.imageView setNeedsDisplay] and [self.imageView setImage:image] were before the image = [ /* function that changes image */ ] line.
I've only tried this in the simulator, and I'm beginning to wonder if it's broken, but I doubt it. Maybe it's an issue with timing, but I paused for a second. What am I doing wrong?

How can I make a blind down effect on an image in iOS?

I want the image to blind up/down when I roll the iPad. The effect should be like the Blind Down demo at http://madrobby.github.com/scriptaculous/combination-effects-demo/.
How can I do that?
I tried Apple's Reflection example, but I had performance issues since I had to redraw the image on every gyroscope update.
Here is the Code:
- (void)viewDidLoad
{
[super viewDidLoad];
tmp = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"galata2.jpg"]];
// Do any additional setup after loading the view, typically from a nib.
NSUInteger reflectionHeight = imageView1.bounds.size.height * 1;
imageView1 = [[UIImageView alloc] init];
imageView1.image = [UIImage imageNamed:@"galata1.jpg"];
[imageView1 sizeToFit];
[self.view addSubview:imageView1];
imageView2 = [[UIImageView alloc] init];
//UIImageView *tmp = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"galata2.jpg"]];
imageView2.image = [UIImage imageNamed:@"galata2.jpg"];
[imageView2 sizeToFit];
[self.view addSubview:imageView2];
motionManager = [[CMMotionManager alloc] init];
motionManager.gyroUpdateInterval = 1.0/10.0;
[motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue currentQueue]
withHandler: ^(CMDeviceMotion *motion, NSError *error){
[self performSelectorOnMainThread:@selector(handleDeviceMotion:) withObject:motion waitUntilDone:YES];
}];
}
////
- (void)handleDeviceMotion:(CMDeviceMotion*)motion{
CMAttitude *attitude = motion.attitude;
int rotateAngle = abs((int)degrees(attitude.roll));
//CMRotationRate rotationRate = motion.rotationRate;
NSLog(#"rotation rate = [Pitch: %f, Roll: %d, Yaw: %f]", degrees(attitude.pitch), abs((int)degrees(attitude.roll)), degrees(attitude.yaw));
int section = (int)(rotateAngle / 30);
int x = rotateAngle % 30;
NSUInteger reflectionHeight = (1024/30)*x;
NSLog(#"[x = %d]", reflectionHeight);
imageView2.image = [self reflectedImage:tmp withHeight:reflectionHeight];
}
////
- (UIImage *)reflectedImage:(UIImageView *)fromImage withHeight:(NSUInteger)height
{
if(height == 0)
return nil;
// create a bitmap graphics context the size of the image
CGContextRef mainViewContentContext = MyCreateBitmapContext(fromImage.bounds.size.width, fromImage.bounds.size.height);
// create a 2 bit CGImage containing a gradient that will be used for masking the
// main view content to create the 'fade' of the reflection. The CGImageCreateWithMask
// function will stretch the bitmap image as required, so we can create a 1 pixel wide gradient
CGImageRef gradientMaskImage = CreateGradientImage(1, kImageHeight);
// create an image by masking the bitmap of the mainView content with the gradient view
// then release the pre-masked content bitmap and the gradient bitmap
CGContextClipToMask(mainViewContentContext, CGRectMake(0.0, 0.0, fromImage.bounds.size.width,height), gradientMaskImage);
CGImageRelease(gradientMaskImage);
// In order to grab the part of the image that we want to render, we move the context origin to the
// height of the image that we want to capture, then we flip the context so that the image draws upside down.
//CGContextTranslateCTM(mainViewContentContext, 0.0,0.0);
//CGContextScaleCTM(mainViewContentContext, 1.0, -1.0);
// draw the image into the bitmap context
CGContextDrawImage(mainViewContentContext, CGRectMake(0, 0, fromImage.bounds.size.width, fromImage.bounds.size.height), fromImage.image.CGImage);
// create CGImageRef of the main view bitmap content, and then release that bitmap context
CGImageRef reflectionImage = CGBitmapContextCreateImage(mainViewContentContext);
CGContextRelease(mainViewContentContext);
// convert the finished reflection image to a UIImage
UIImage *theImage = [UIImage imageWithCGImage:reflectionImage];
// image is retained by the property setting above, so we can release the original
CGImageRelease(reflectionImage);
return theImage;
}
One way to do this is to use another covering view that gradually changes its height with an animation.
If you have a view called theView that you want to cover, try something like this to reveal theView underneath a cover view:
UIView *coverView = [[UIView alloc] initWithFrame:theView.frame];
coverView.backgroundColor = [UIColor whiteColor];
[theView.superview addSubview:coverView]; // this covers theView, adding it to the same view that theView is contained in
CGRect newFrame = theView.frame;
newFrame.size.height = 0;
newFrame.origin.y = theView.frame.origin.y + theView.frame.size.height;
[UIView animateWithDuration:1.5
delay: 0.0
options: UIViewAnimationOptionRepeat
animations:^{
coverView.frame = newFrame;
}
completion:nil
];
This should cover the view and then reveal it by changing the frame of the cover, moving it down while changing the height.
I haven't tried the code, but this is one direction you can take to create the blind effect. I have used similar code often, and it is very easy to work with. Also, it doesn't require knowing core animation.
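If the blind amount should track the gyroscope rather than a fixed animation, a hedged sketch (reusing the question's handleDeviceMotion: and degrees() macro, and assuming coverView and theView are ivars) could set the cover's frame directly from the roll angle:
- (void)handleDeviceMotion:(CMDeviceMotion *)motion {
    int rotateAngle = abs((int)degrees(motion.attitude.roll));
    CGFloat fraction = MIN(1.0, rotateAngle / 90.0); // assumed mapping: 90 degrees of roll = fully revealed
    CGRect frame = theView.frame;
    frame.size.height = theView.frame.size.height * (1.0 - fraction); // shrink the cover...
    frame.origin.y = theView.frame.origin.y + theView.frame.size.height * fraction; // ...while sliding it down
    coverView.frame = frame;
}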
