Merge multiple images into one image in iPhone [closed] - ios

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Questions asking for code must demonstrate a minimal understanding of the problem being solved. Include attempted solutions, why they didn't work, and the expected results. See also: Stack Overflow question checklist
Closed 9 years ago.
I want to merge multiple images which are of different sizes and drawn at different points.
I want to merge them all and save the result as one copy (image).
So how can I merge the images into one image?

I found a solution that is quite simple.
You can merge multiple images by creating the following method:
- (BOOL)mergedImageOnMainImage:(UIImage *)mainImg WithImageArray:(NSArray *)imgArray AndImagePointArray:(NSArray *)imgPointArray
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // Draw the main image first, then each overlay image at its own point.
    UIGraphicsBeginImageContext(mainImg.size);
    [mainImg drawInRect:CGRectMake(0, 0, mainImg.size.width, mainImg.size.height)];
    int i = 0;
    for (UIImage *img in imgArray) {
        // The point array holds alternating x and y values, two entries per image.
        [img drawInRect:CGRectMake([[imgPointArray objectAtIndex:i] floatValue],
                                   [[imgPointArray objectAtIndex:i + 1] floatValue],
                                   img.size.width,
                                   img.size.height)];
        i += 2;
    }
    CGImageRef newMergedImg = CGImageCreateWithImageInRect(UIGraphicsGetImageFromCurrentImageContext().CGImage,
                                                           CGRectMake(0, 0, mainImg.size.width, mainImg.size.height));
    UIGraphicsEndImageContext();
    [pool release];
    if (newMergedImg == NULL) {
        return NO;
    }
    UIImageWriteToSavedPhotosAlbum([UIImage imageWithCGImage:newMergedImg], self, nil, nil);
    CGImageRelease(newMergedImg); // CGImageCreateWithImageInRect returns a +1 reference; release it to avoid a leak.
    return YES;
}
Now call this method in the following way:
NSArray *imgArray = [[NSArray alloc] initWithObjects:
                     [UIImage imageNamed:@"image06.png"],
                     [UIImage imageNamed:@"image07.png"],
                     [UIImage imageNamed:@"image08.png"],
                     [UIImage imageNamed:@"image09.png"],
                     [UIImage imageNamed:@"BackBtn.png"],
                     [UIImage imageNamed:@"Facebook.png"], nil];
NSArray *imgPointArray = [[NSArray alloc] initWithObjects:
                          @"10", @"10",
                          @"10", @"25",
                          @"30", @"15",
                          @"30", @"50",
                          @"20", @"80",
                          @"25", @"100", nil];
BOOL suc = [self mergedImageOnMainImage:[UIImage imageNamed:@"img001.png"] WithImageArray:imgArray AndImagePointArray:imgPointArray];
if (suc == YES) {
    NSLog(@"Images successfully merged and saved to album");
}
else {
    NSLog(@"Images not merged and not saved to album");
}

You can use CIImage for this:
- (UIImage *)mergeTwoImage:(UIImage *)image1 andImage2:(UIImage *)image2
{
    CIImage *topImage = [[CIImage alloc] initWithImage:image1];
    CIImage *backgroundImage = [[CIImage alloc] initWithImage:image2];
    CIFilter *darkenFilter = [CIFilter filterWithName:@"CIDarkenBlendMode"
                                        keysAndValues:kCIInputImageKey, topImage,
                                                      @"inputBackgroundImage", backgroundImage, nil];
    CIImage *resultOfFilter = darkenFilter.outputImage;
    CIContext *ctx = [CIContext contextWithOptions:nil];
    CGImageRef imageToReturn = [ctx createCGImage:resultOfFilter fromRect:resultOfFilter.extent];
    UIImage *outputImage = [UIImage imageWithCGImage:imageToReturn];
    CGImageRelease(imageToReturn);
    return outputImage;
}
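Note that CIDarkenBlendMode darkens where the two images overlap rather than simply stacking them. If you want a plain overlay instead, a sketch of the filter setup using CISourceOverCompositing (this assumes the top image has transparency where the background should show through):

// Swap in a source-over composite; render with a CIContext as above.
CIFilter *overFilter = [CIFilter filterWithName:@"CISourceOverCompositing"];
[overFilter setValue:topImage forKey:kCIInputImageKey];
[overFilter setValue:backgroundImage forKey:@"inputBackgroundImage"];
CIImage *resultOfFilter = overFilter.outputImage;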

You can use this code:
- (IBAction)mergeImage:(id)sender
{
    UIImage *framedImage = firstImageView.image;
    CGSize size = CGSizeMake(firstImageView.frame.size.width, firstImageView.frame.size.height);
    // The second parameter is the BOOL opaque flag; pass NO to keep transparency.
    UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
    UIImage *imageToPlace = secondImageView.image;
    [framedImage drawInRect:CGRectMake(0, 0, size.width, size.height)];
    // Hard-coded placement of the second image inside the frame.
    [imageToPlace drawInRect:CGRectMake(18, 25, 242, 243)];
    UIImage *imageC = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageView *newImageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, framedImage.size.width, framedImage.size.height)];
    newImageView.image = imageC;
    newImageView.contentMode = UIViewContentModeScaleAspectFill;
    [self.view addSubview:newImageView]; // show the merged result
}
This is for two images; it merges two UIImageViews into one. The same approach works for multiple images: you draw each image at its own rect, sized as you need, as the sketch below shows.
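As a sketch of that generalization (the imageViews array and canvas size are assumptions you would supply yourself), you can draw any number of image views into one context:

// Hypothetical helper: draw each image view's image at the view's own
// frame, producing a single composite image.
- (UIImage *)mergeImageViews:(NSArray *)imageViews canvasSize:(CGSize)canvasSize
{
    UIGraphicsBeginImageContextWithOptions(canvasSize, NO, 0.0);
    for (UIImageView *iv in imageViews) {
        [iv.image drawInRect:iv.frame]; // each image lands where its view sits
    }
    UIImage *merged = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return merged;
}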

Related

Issue displaying Image with NSMutableArray

I am very new to Xcode and I am playing around with image arrays and NSMutableArray. When I run the code below, it just prints out the name of the image, for example "image1.png". Any tips to fix this problem would be appreciated. Thanks.
- (NSMutableArray *)restaurantArray;
{
    if (_restaurantArray == nil) {
        _restaurantArray = [[NSMutableArray alloc] initWithObjects:
                            @"image1.png",
                            @"image2.png",
                            @"image3.png",
                            nil];
    }
    return _restaurantArray;
}
- (NSString *)randomRestaurant
{
    int random = arc4random_uniform(self.restaurantArray.count);
    return [self.restaurantArray objectAtIndex:random];
}
You could also do it this way:
- (NSMutableArray *)restaurantArray;
{
    if (_restaurantArray == nil) {
        _restaurantArray = [[NSMutableArray alloc] initWithObjects:
                            [UIImage imageNamed:@"image1.png"],
                            [UIImage imageNamed:@"image2.png"],
                            [UIImage imageNamed:@"image3.png"],
                            nil];
    }
    return _restaurantArray;
}
Do you mean this?
NSString *imageName = [self randomRestaurant];
UIImage *image = [UIImage imageNamed:imageName];
If you want to display the image, then do this. Suppose you have an image view named imgView:
imgView.image = [UIImage imageNamed:[self randomRestaurant]];
Or, if you just want the image, then call this:
UIImage *img = [UIImage imageNamed:[self randomRestaurant]];
In your given code, you are just returning the name of the image, not the actual image. If you want an array of images, you can do it this way:
- (NSMutableArray *)restaurantArray;
{
    if (_restaurantArray == nil) {
        UIImage *img = [UIImage imageNamed:@"image1.png"];
        UIImage *img1 = [UIImage imageNamed:@"image2.png"];
        UIImage *img2 = [UIImage imageNamed:@"image3.png"];
        _restaurantArray = [[NSMutableArray alloc] initWithObjects:
                            img,
                            img1,
                            img2,
                            nil];
    }
    return _restaurantArray;
}
- (UIImage *)randomRestaurant
{
    int random = arc4random_uniform(self.restaurantArray.count);
    return [self.restaurantArray objectAtIndex:random];
}
Hope this helps.. :)

Reloading UICollectionView With UIImages not Releasing Memory

I use the following code to create borders around images. I loop through it multiple times to create thumbnail images with borders, which are then placed in a UICollectionView.
The problem seems to be that every time I reload the UICollectionView, the images are not released from memory, and usage builds up to the point where the app crashes. I don't think it is the UICollectionView code, because if I run it with images that don't require a border I don't get any issues.
- (UIImage *)applyFrame:(UIImage *)image selectedFrame:(NSString *)selectedFrame {
    UIImage *frame;
    NSLog(@"Image Size For Frames: %f, %f", image.size.width, image.size.height);
    if (image.size.width == image.size.height) {
        frame = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%@-Square.png", selectedFrame] ofType:nil]];
    }
    if (image.size.width > image.size.height) {
        frame = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%@.png", selectedFrame] ofType:nil]];
    }
    if (image.size.width < image.size.height) {
        frame = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%@.png", selectedFrame] ofType:nil]];
        frame = [self rotateUIImage:frame clockwise:true];
    }
    GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
    GPUImagePicture *imageToProcess = [[GPUImagePicture alloc] initWithImage:image];
    GPUImagePicture *border = [[GPUImagePicture alloc] initWithImage:frame];
    blendFilter.mix = 1.0f;
    [imageToProcess addTarget:blendFilter];
    [border processImage];
    [border addTarget:blendFilter];
    [imageToProcess processImage];
    return [blendFilter imageFromCurrentlyProcessedOutput];
}
- (UIImage *)rotateUIImage:(UIImage *)sourceImage clockwise:(BOOL)clockwise {
    CGSize size = sourceImage.size;
    UIGraphicsBeginImageContext(CGSizeMake(size.height, size.width));
    [[UIImage imageWithCGImage:[sourceImage CGImage] scale:1.0 orientation:clockwise ? UIImageOrientationRight : UIImageOrientationLeft] drawInRect:CGRectMake(0, 0, size.height, size.width)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
UIImage imageWithContentsOfFile: returns an autoreleased object, which will be released only when the app returns to the run loop. So, might it be that your app loops for a long time executing this code? In that case, the autoreleased objects accumulate and increase heap usage.
If so, you could try to wrap the body of the responsible loop in an autorelease block:
@autoreleasepool {
    ...your statements
}
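For example, a sketch of what that might look like around the thumbnail loop described above (the thumbnails, frameName, and framedThumbs names are placeholders, not from the question):

// Draining the pool every iteration releases the autoreleased UIImages
// immediately instead of letting them pile up until the next run loop turn.
for (UIImage *thumb in thumbnails) {
    @autoreleasepool {
        UIImage *framed = [self applyFrame:thumb selectedFrame:frameName];
        [framedThumbs addObject:framed];
    }
}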

Can't Capture Full Screen of iCarousel

I have a problem capturing the full screen of an iCarousel. It captures only the current index of the carousel.
UIGraphicsBeginImageContext(caputureView.bounds.size);
[caputureView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *resultingImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Try something like this:
- (void)getFullScreenScreenShot
{
    AppDelegate *appDelegate = (AppDelegate *)[[UIApplication sharedApplication] delegate];
    UIView *superView = appDelegate.viewController.view;
    CGRect fullScreenFrame = superView.frame;
    UIGraphicsBeginImageContextWithOptions(fullScreenFrame.size, YES, 0.0f);
    CGContextTranslateCTM(UIGraphicsGetCurrentContext(), 0.0f, 0.0f);
    [superView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImageView *screenShot = [[UIImageView alloc] initWithImage:UIGraphicsGetImageFromCurrentImageContext()];
    UIGraphicsEndImageContext();
    NSData *imageData = UIImageJPEGRepresentation(screenShot.image, 1.0);
    NSString *previewFileNamePath = [[CPFileManager documentsPath] stringByAppendingString:@"image.jpg"];
    if ([imageData writeToFile:previewFileNamePath atomically:NO])
    {
        NSLog(@"See filename:%@", previewFileNamePath);
    }
    else
    {
        NSLog(@"Error: %@", previewFileNamePath);
    }
}

iOS - Setting blurred image on top of other views, odd issues

So, I've got an odd scenario.
In my iOS app, I'm trying to blur the content area of the screen when a popover is opened.
I have this working when using Core Image, but only with Gaussian blur; none of the other blurs work, which is odd.
I tried doing the same with GPUImage, and it blurs far faster, but doesn't actually put the view on top of the other views!
To summarize: in the source below, setBlurOnView works properly; however, setBlurOnViewWithGPUImage does not appear to work. The blur view (tag 6110) is created, but the app doesn't actually blur.
Note: This is on iOS 6, in the simulator.
Here's the relevant source:
// ScreenBlur.m
#import <QuartzCore/QuartzCore.h>
#import <CoreImage/CoreImage.h>
#import <GPUImage/GPUImage.h>
#import "ScreenBlur.h"
#import "GlobalData.h"
#import "Logger.h"

@implementation ScreenBlur

+ (void)setBlurOnViewWithGPUImage:(UIView *)view {
    GPUImagePicture *imageSource = [[GPUImagePicture alloc] initWithImage:[self captureScreenInRect:view.frame inView:view]];
    GPUImageGaussianBlurFilter *blur = [[GPUImageGaussianBlurFilter alloc] init];
    [imageSource addTarget:blur];
    [imageSource processImage];
    [self setImage:[imageSource imageFromCurrentlyProcessedOutput] toView:view];
}

+ (void)setBlurOnView:(UIView *)view {
    //http://stackoverflow.com/questions/17041669/creating-a-blurring-overlay-view
    CIImage *inputImage = [CIImage imageWithCGImage:[self captureScreenInRect:view.frame inView:view].CGImage];
    //CIContext *context = [CIContext contextWithOptions:nil];
    if ([GlobalData getInstance].ciContext == nil) {
        [Logger Log:@"ciContext does not exist, creating..." fromClass:@"ScreenBlur"];
        // [GlobalData getInstance].ciContext = [CIContext contextWithOptions:nil]; //cpu context
        [GlobalData getInstance].ciContext = [CIContext contextWithEAGLContext:[[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2]];
    }
    //set up the blur filter
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:[NSNumber numberWithFloat:3.0f] forKey:@"inputRadius"];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];
    // CIGaussianBlur has a tendency to shrink the image a little,
    // this ensures it matches up exactly to the bounds of our original image
    CGImageRef cgImage = [[GlobalData getInstance].ciContext createCGImage:result fromRect:[inputImage extent]];
    [self setImage:[UIImage imageWithCGImage:cgImage] toView:view];
}

+ (void)setImage:(UIImage *)blurredImage toView:(UIView *)view {
    UIView *blurView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, blurredImage.size.width, blurredImage.size.height)];
    [blurView setBackgroundColor:[UIColor colorWithPatternImage:blurredImage]];
    [blurView setTag:6110];
    //set the image as the foreground for the view
    [view addSubview:blurView];
    [view bringSubviewToFront:blurView];
}

//same as the method above, but resizes the screenshot before applying the blur
//for increased performance at the expense of image quality.
+ (void)setBlurOnViewPerformanceMode:(UIView *)view {
    //http://stackoverflow.com/questions/17041669/creating-a-blurring-overlay-view
    UIImage *screenShot = [self imageWithImage:[self captureScreenInRect:view.frame inView:view] scaledToSize:CGSizeMake(view.frame.size.width / 2, view.frame.size.height / 2)];
    CIImage *inputImage = [CIImage imageWithCGImage:screenShot.CGImage];
    //CIContext *context = [CIContext contextWithOptions:nil];
    if ([GlobalData getInstance].ciContext == nil) {
        [Logger Log:@"ciContext does not exist, creating..." fromClass:@"ScreenBlur"];
        // [GlobalData getInstance].ciContext = [CIContext contextWithOptions:nil]; //cpu context
        [GlobalData getInstance].ciContext = [CIContext contextWithEAGLContext:[[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2]];
    }
    //set up the blur filter
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:[NSNumber numberWithFloat:3.0f] forKey:@"inputRadius"];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];
    CGImageRef cgImage = [[GlobalData getInstance].ciContext createCGImage:result fromRect:[inputImage extent]];
    [self setImage:[self imageWithImage:[UIImage imageWithCGImage:cgImage] scaledToSize:view.frame.size] toView:view];
}

+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}

+ (void)removeBlurFromView:(UIView *)view {
    for (UIView *subView in view.subviews) {
        if (subView.tag == 6110) {
            [subView removeFromSuperview];
        }
    }
}

+ (UIImage *)captureScreenInRect:(CGRect)captureFrame inView:(UIView *)view {
    CALayer *layer = view.layer;
    UIGraphicsBeginImageContext(view.bounds.size);
    CGContextClipToRect(UIGraphicsGetCurrentContext(), captureFrame);
    [layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *screenImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return screenImage;
}

@end
And then in my view controller, it's simply called with
[ScreenBlur setBlurOnView:self.view];
I found a workaround for this (or, who knows, maybe this is how it was supposed to be done).
//ScreenBlur.m
+ (GPUImageView *)getBlurredImageWithGPUImageFromView:(UIView *)view {
    GPUImagePicture *imageSource = [[GPUImagePicture alloc] initWithImage:[self captureScreenInRect:view.frame inView:view] smoothlyScaleOutput:YES];
    GPUImageFastBlurFilter *blur = [[GPUImageFastBlurFilter alloc] init];
    [blur setBlurPasses:3];
    [imageSource addTarget:blur];
    GPUImageView *filteredView = [[GPUImageView alloc] initWithFrame:view.frame];
    [blur addTarget:filteredView];
    [imageSource processImage];
    return filteredView;
}

//ViewController.m
//blur the main screen
GPUImageView *blurred = [ScreenBlur getBlurredImageWithGPUImageFromView:self.view];
[blurred setTag:6110];
[self.view addSubview:blurred];
[self.view bringSubviewToFront:blurred];
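Since the workaround still sets tag 6110, the removeBlurFromView: method already defined above can dismiss the overlay later:

// Removes the blur overlay found by its tag.
[ScreenBlur removeBlurFromView:self.view];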

iOS: can't view downloaded images

I'm a novice at this, so I would like to know what is missing in my code to download/show these images:
self.carView.contentSize = CGSizeMake(280 * self.carImages.count - 280, 200);
self.carView.pagingEnabled = YES;
CGRect imageRect = CGRectMake(0.0, 0.0, 280 * self.carImages.count - 280, 200);
UIGraphicsBeginImageContext(imageRect.size);
UIImageView *carImg = [[UIImageView alloc] init];
for (int i = 1; i < self.carImages.count; i++) {
    NSData *imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:[self.carImages objectAtIndex:i]]];
    UIImage *carImage = [[UIImage alloc] initWithData:imageData];
    [carImage drawAtPoint:CGPointMake(280 * (i - 1), 0)];
    [carImage release];
    [imageData release];
    NSLog(@"%d", i);
}
carImg.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[self.carView addSubview:carImg];
[carImg release];
carView is my scroll view.
carImages is a mutable array with the URLs of the images.
pagingEnabled and contentSize are being set, but no images appear.
I bet I'm missing something very stupid, as I always do.
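One likely culprit, visible in the snippet itself: the UIImageView is created with [[UIImageView alloc] init], so its frame stays CGRectZero and the image assigned to it can never be seen. A minimal sketch of that one fix (not a verified answer; the synchronous initWithContentsOfURL: download on the main thread is a separate concern):

// Give the image view a non-zero frame matching the composed content,
// so the image drawn into the context actually has room to display.
UIImageView *carImg = [[UIImageView alloc] initWithFrame:imageRect];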
