I'm trying to resize down a GIF image in iOS, so I'm looping through each frame, as you can see in the code below.
NSArray *animationImages = [NSArray arrayWithArray:sourceImage.images];
NSMutableArray *newAnimationImages = [[NSMutableArray alloc] init];
//@autoreleasepool {
UIGraphicsBeginImageContext(CGSizeMake(newWidth, newHeight));
for(UIImage *frame in animationImages) {
[frame drawInRect:CGRectMake(0, 0, newWidth, newHeight)];
UIImage *newFrame = UIGraphicsGetImageFromCurrentImageContext();
[newAnimationImages addObject:newFrame];
}
UIGraphicsEndImageContext();
//}
newImage = [UIImage animatedImageWithImages:newAnimationImages duration:sourceImage.duration];
Unfortunately, for some reason, it's taking ages to resize (around 3-4 seconds for a 2-4 second GIF).
Any ideas what I'm doing wrong?
Quick update - I ended up using the Aspect Fit content mode on a UIImageView, which did exactly what I needed, and then did the actual resizing server-side.
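For what it's worth, one likely cause of the slowdown in the original code is that every frame is decoded and drawn inside a single image context, with no autorelease pool to drain the intermediate bitmaps between frames. A sketch of the same loop with a per-frame pool and a fresh context per frame (so frames with transparency don't composite on top of each other); this is an illustration of the idea, not a measured fix:

```objc
NSArray *animationImages = sourceImage.images;
NSMutableArray *newAnimationImages =
    [[NSMutableArray alloc] initWithCapacity:animationImages.count];
CGSize newSize = CGSizeMake(newWidth, newHeight);

for (UIImage *frame in animationImages) {
    @autoreleasepool {
        // A fresh context per frame avoids frames with alpha stacking up,
        // and the pool releases each intermediate bitmap promptly.
        UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
        [frame drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
        [newAnimationImages addObject:UIGraphicsGetImageFromCurrentImageContext()];
        UIGraphicsEndImageContext();
    }
}

UIImage *newImage = [UIImage animatedImageWithImages:newAnimationImages
                                            duration:sourceImage.duration];
```

Even so, redrawing every frame on the CPU is inherently slow for long GIFs, which is why doing the resize server-side (as in the update above) is often the pragmatic choice.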
I have the implementation below to save a UIView as an image to the device's photo album. It works correctly; however, the saved image adapts to the device's screen resolution. For instance, if I run it on an iPhone 5, the saved image will be 640 x 640 px. My goal is to save custom-sized images, like 1800 x 1800 px, on every device. So I would really appreciate it if somebody could give me an example or any guidance that helps me find the right solution. Any other tips are welcome; my goal is to make custom pixel-sized images, and it doesn't matter if I have to use a different implementation.
- (IBAction)saveImg:(id)sender {
UIImage *imageToSave = [[UIImage alloc] init];
// self.fullVw holds the image, that I want to save
imageToSave = [self.fullVw pb_takeSnapshot];
NSData *pngData = UIImagePNGRepresentation(imageToSave);
UIImage *imageToSave2 = [[UIImage alloc] init];
imageToSave2 = [UIImage imageWithData:pngData];
UIImageWriteToSavedPhotosAlbum(imageToSave2,nil, nil, nil);
}
// This method is in a UIView category
- (UIImage *)pb_takeSnapshot {
UIGraphicsBeginImageContextWithOptions(self.bounds.size, self.opaque, 0.0);
[self drawViewHierarchyInRect:self.bounds afterScreenUpdates:YES];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;
}
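One way to get a fixed pixel size regardless of the device's screen scale is to render into a context of the target size with a scale factor of 1.0 and scale the drawing to fit. A sketch along the lines of the category above (the method name `pb_takeSnapshotWithSize:` is illustrative, not an existing API):

```objc
// In the same UIView category; renders the view into an arbitrary pixel size.
- (UIImage *)pb_takeSnapshotWithSize:(CGSize)targetSize {
    // Scale factor 1.0 makes points equal pixels, so targetSize is the
    // exact pixel size of the output on every device.
    UIGraphicsBeginImageContextWithOptions(targetSize, self.opaque, 1.0);
    CGContextRef context = UIGraphicsGetCurrentContext();
    // Scale the view's bounds up (or down) to fill the target size.
    CGContextScaleCTM(context,
                      targetSize.width / self.bounds.size.width,
                      targetSize.height / self.bounds.size.height);
    [self.layer renderInContext:context];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
```

With this, `[self.fullVw pb_takeSnapshotWithSize:CGSizeMake(1800, 1800)]` should produce an 1800 x 1800 px image on any device, at the cost of upscaling artifacts when the view is much smaller than the target size.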
I have an animation of images, and some of the images have different widths and heights. What I am looking to do is set the width and height of each of these images.
jumperguy.animationImages = [NSArray arrayWithObjects:
[UIImage imageNamed:@"jumperguy_a_1.png"],
[UIImage imageNamed:@"jumperguy_a_2.png"],
[UIImage imageNamed:@"jumperguy_a_3.png"],
[UIImage imageNamed:@"jumperguy_a_4"], nil];
[jumperguy setAnimationRepeatCount:1];
jumperguy.animationDuration = 1;
[jumperguy startAnimating];
UIImageView animation does a "flip book" style animation where each image is drawn into the same frame. I don't believe it will handle images of different sizes between frames. As Wain suggests, you should scale your images to fit in the same frame in an image editor before putting them into your app.
As I mentioned in my comment, you can programmatically scale each image in your array like so:
...
CGFloat width = whateverWidth;
CGFloat height = whateverHeight;
jumperguy.animationImages = [NSArray arrayWithObjects:
[self imageWithImage:[UIImage imageNamed:@"jumperguy_a_1.png"] scaledToSize:CGSizeMake(width, height)],
[self imageWithImage:[UIImage imageNamed:@"jumperguy_a_2.png"] scaledToSize:CGSizeMake(width, height)],
[self imageWithImage:[UIImage imageNamed:@"jumperguy_a_3.png"] scaledToSize:CGSizeMake(width, height)],
[self imageWithImage:[UIImage imageNamed:@"jumperguy_a_4.png"] scaledToSize:CGSizeMake(width, height)], nil];
[jumperguy setAnimationRepeatCount:1];
jumperguy.animationDuration = 1;
[jumperguy startAnimating];
}
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
//UIGraphicsBeginImageContext(newSize);
// In next line, pass 0.0 to use the current device's pixel scaling factor (and thus account for Retina resolution).
// Pass 1.0 to force exact pixel size.
UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
[image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
I have an NSMutableArray that contains several UIImage objects. These images have different dimensions, and they all need to have a width of 140, but forcing that width will make the images look really awful in my UICollectionView.
Is there a method where I put in the UIImage, set the width to 140, and then get the new height, which I need for the cell height?
Working code:
//create image array
UIImage *image = [UIImage imageNamed:@"yourimagename.png"];
NSArray *imageArray = [NSArray arrayWithObjects:image, nil];
//get image original height, original width...
CGFloat originalHeight = image.size.height;
CGFloat originalWidth = image.size.width;
//set wanted width and calculate wanted height...
CGFloat wantedWidth = 140;
CGFloat wantedHeight = (originalHeight*wantedWidth)/originalWidth;
//call resize image method... setting wanted height and width....
UIImage *avatarImage = [self imageWithImage:[imageArray objectAtIndex:0] scaledToSize:CGSizeMake(wantedWidth, wantedHeight)];
//create imageView.. ... .
UIImageView *avatarImageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 100, wantedWidth, wantedHeight)];
[avatarImageView setImage:avatarImage];
[self.view addSubview:avatarImageView];
Resize image method.
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
// note: this renders at 1x; use UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0) for Retina output
UIGraphicsBeginImageContext(newSize);
[image drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
I have an image that can be stretched in the horizontal direction but needs to be tiled vertically (one copy on top of the other) to fill my view. How can I do this?
I know that an image can be made resizable by using the -(UIImage *)resizableImageWithCapInsets:(UIEdgeInsets)capInsets method. That works great for the horizontal direction, but it cannot be used for the vertical direction; the image really needs to be tiled, not stretched, vertically for this to work.
Now my idea of how this could work is to run a loop that creates a UIImageView with the horizontally stretched image, adjusts its frame.origin.y property, and adds it as a subview on each iteration until I've gone past the height of the view. However, this seems like an overly complex way of doing it, and it's really not practical when the view needs to be resized (on an iPad rotation, for example).
Is there a simpler more efficient way of doing this on iOS 6.x?
I used this to tile an image horizontally in a UIImageView:
UIImage *image = [UIImage imageNamed:@"my_image"];
UIImage *tiledImage = [image resizableImageWithCapInsets:UIEdgeInsetsMake(0, 0, 0, 0) resizingMode:UIImageResizingModeTile];
UIImageView *imageView = [[UIImageView alloc] initWithImage:tiledImage];
This assumes you set the frame of the UIImageView to a larger size than your image asset. If the height is taller than your image, it will also tile vertically.
Have you considered using UIColor's method colorWithPatternImage: to create a repeating pattern and just pass an image with the correct horizontal width?
Code Example:
// On your desired View Controller
- (void)viewDidLoad
{
[super viewDidLoad];
UIImage *patternImage = [UIImage imageNamed:@"repeating_pattern"];
self.view.backgroundColor = [UIColor colorWithPatternImage:patternImage];
// ... do your other stuff here...
}
UIImage *image = [UIImage imageNamed:@"test.png"];
UIImage *resizableImage = [image resizableImageWithCapInsets:UIEdgeInsetsMake(0, image.size.width / 2, 0, image.size.width / 2)];
So I've figured out a solution for this. It is, I think, a little hacky, but it works.
Basically, what I did was create my stretchable image with the left and right caps specified, and then initialize a UIImageView with it. I then adjust the frame of the image view to the desired width, which appropriately resizes the image contained within it.
Finally, I used a piece of code I found that creates a new image by vertically tiling the adjusted image now contained in the image view. Notice how, instead of accessing the original UIImage, I am using the image view's image property.
The last step is to set the new UIImage as the patterned background colour of the view.
Here is the code:
// create a stretchable image
UIImage *image = [[UIImage imageNamed:@"progressBackgroundTop@2x.png"] resizableImageWithCapInsets:UIEdgeInsetsMake(0, 60, 0, 60)];
// create the image view that will contain it
UIImageView *imageView = [[UIImageView alloc] initWithImage:image];
CGRect frame = imageView.frame;
frame.size.width = self.view.bounds.size.width;
imageView.frame = frame;
// create the tiled image
CGSize imageViewSize = imageView.bounds.size;
UIGraphicsBeginImageContext(imageViewSize); // 1x; use UIGraphicsBeginImageContextWithOptions for Retina output
CGContextRef imageContext = UIGraphicsGetCurrentContext();
CGContextDrawTiledImage(imageContext, (CGRect){ CGPointZero, imageViewSize }, imageView.image.CGImage);
UIImage *finishedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
self.view.backgroundColor = [UIColor colorWithPatternImage:finishedImage];
I know how to render a normal UIView to a bitmap image:
UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *bitmapImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
The problem is that if view is a UIImageView whose image is a stretched resizable image returned from resizableImageWithCapInsets:, the bitmapImage obtained this way is the original, un-stretched image instead of the stretched image that is actually displayed. I can easily tell that by the size difference between bitmapImage and view. So my question is: how do I render the full content displayed by a UIImageView whose image is a stretched resizable image?
So, I tried this piece of code:
UIEdgeInsets edgeInsets = UIEdgeInsetsMake(-20, -20, -20, -20);
UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 300, 300)];
imageView.image = [[UIImage imageNamed:@"number1.png"] resizableImageWithCapInsets:edgeInsets];
[self.view addSubview:imageView];
UIGraphicsBeginImageContextWithOptions(imageView.bounds.size, NO, 0.0);
[imageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *bmImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSArray *documentsPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDir = [documentsPaths objectAtIndex:0];
[UIImagePNGRepresentation(bmImage) writeToFile:[documentsDir stringByAppendingPathComponent:@"num1.png"] atomically:YES];
The rendered output matched the stretched image view rather than the original asset (screenshots omitted), so it seems like the code works as it is supposed to.
I'm an idiot! The code works perfectly; I just messed something up somewhere else. The stupid thing is that I put the code in a method of my UIView category but named the method image. That works fine for general UIViews, but since UIImageView has its own image method and UIImageView is a subclass of UIView, UIImageView's image method was being called when I tried to get my bitmap image.
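The lesson generalizes: category methods on UIKit classes should carry a prefix (as the pb_takeSnapshot method earlier in this page does) so they can never collide with an existing selector on a subclass. A minimal sketch of the fixed category, with an illustrative xyz_ prefix:

```objc
// A prefixed name cannot shadow (or be shadowed by) UIImageView's own -image.
@interface UIView (Snapshot)
- (UIImage *)xyz_snapshotImage;
@end

@implementation UIView (Snapshot)
- (UIImage *)xyz_snapshotImage {
    // Renders the layer tree, including a stretched resizable image,
    // exactly as it is currently displayed.
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0);
    [self.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return snapshot;
}
@end
```

Calling [imageView xyz_snapshotImage] then unambiguously invokes the category method, whereas the unprefixed image selector resolved to UIImageView's accessor and returned the un-stretched source image.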