Possible duplicate: Saving two Overlapping UIImage
In a UIScrollView I've added my UIImageView, and there is a frame layer above it. I want to save the final image on screen after all editing. The answer given to the question mentioned above does the job, but drawing the frame degrades the quality of my image, so I'm looking for a solution that keeps the image at good resolution.
Please help me out with this, thanks in advance!
Try with this code:
// Merge the two images: draw the background, then the foreground on top
UIImage *bottomImage = imgview.image;   // background (1st) image
UIImage *image = imgProfile.image;      // foreground (2nd) image
CGSize newSize = CGSizeMake(270, 330);  // size of the resulting image
UIGraphicsBeginImageContext(newSize);
// Draw the background image to fill the whole canvas
[bottomImage drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
// Draw the foreground image at its own frame on top of the background, with the desired alpha
[image drawInRect:CGRectMake(81, 218, 97, 78) blendMode:kCGBlendModeNormal alpha:1];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Save the merged image as a PNG in the app's Documents directory
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *savedImagePath = [documentsDirectory stringByAppendingPathComponent:@"savedImage.png"];
NSData *imageData = UIImagePNGRepresentation(newImage);
[imageData writeToFile:savedImagePath atomically:NO];
You can then find the newly created image in your application's Documents directory.
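If the merged result looks soft on Retina devices, one option is to open the context with UIGraphicsBeginImageContextWithOptions and pass 0.0 for the scale so the canvas matches the screen scale. A minimal sketch of the same merge, assuming the same bottomImage and image variables as above:
// Sketch: scale-aware context for better output quality.
// A scale of 0.0 means "use the device's main screen scale" (2x/3x on Retina devices).
CGSize mergedSize = CGSizeMake(270, 330);
UIGraphicsBeginImageContextWithOptions(mergedSize, NO, 0.0);
[bottomImage drawInRect:CGRectMake(0, 0, mergedSize.width, mergedSize.height)];
[image drawInRect:CGRectMake(81, 218, 97, 78) blendMode:kCGBlendModeNormal alpha:1];
UIImage *mergedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();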
Related
I have a UILabel of size 180 x 180 and I am converting it to an image, but when I save it I get a 540 x 540 image; I need 180 x 180. Here is my code.
- (void)grabImage {
// Create a "canvas" (image context) to draw in.
UIGraphicsBeginImageContextWithOptions(_lbl1.bounds.size, _lbl1.opaque, 0.0); // high res
// Make the CALayer to draw in our "canvas".
[[_lbl1 layer] renderInContext: UIGraphicsGetCurrentContext()];
// Fetch an UIImage of our "canvas".
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
// Stop the "canvas" from accepting any input.
UIGraphicsEndImageContext();
NSData *pngData = UIImagePNGRepresentation(image);
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsPath = [paths objectAtIndex:0]; //Get the docs directory
NSString *filePath = [documentsPath stringByAppendingPathComponent:@"image180.png"]; //Add the file name
[pngData writeToFile:filePath atomically:YES]; //Write the file
}
The quality is poor; in the attached image the upper one is the original UILabel and the bottom one is the generated image, you can see the difference.
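The 540 x 540 output is most likely the screen scale at work: passing 0.0 to UIGraphicsBeginImageContextWithOptions renders at the device scale (3x on an @3x device, so 180 points becomes 540 pixels). A minimal sketch of forcing a 1x bitmap, assuming the same _lbl1 label as in the code above:
// Sketch: force a 1x context so the bitmap is exactly 180 x 180 pixels.
// Note that a 1x bitmap will look softer than the on-screen Retina rendering.
UIGraphicsBeginImageContextWithOptions(_lbl1.bounds.size, _lbl1.opaque, 1.0);
[[_lbl1 layer] renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image1x = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();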
I get the image from a UIImageView and save it in the app's Documents folder; that works, but when the image has a very large resolution, how do I save it with a fixed width and height? Here is my code, please help me.
NSInteger N = 100000000;
NSUInteger r = arc4random_uniform((uint32_t)N) + 1;
NSString *myimageName = [NSString stringWithFormat:@"image%lu.png", (unsigned long)r];
NSData *pngData = UIImagePNGRepresentation(self.myImage.image);
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsPath = [paths objectAtIndex:0]; //Get the docs directory
NSString *filePath = [documentsPath stringByAppendingPathComponent:myimageName]; //Add the file name
[pngData writeToFile:filePath atomically:YES];
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
First resize your image, then save the result.
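For example, a sketch of putting the two steps together, assuming the helper above lives in a class named ImageUtils (an illustrative name) and self.myImage is the UIImageView from the question:
// Sketch: resize first, then save the scaled PNG instead of the full-size one.
UIImage *scaled = [ImageUtils imageWithImage:self.myImage.image
                                scaledToSize:CGSizeMake(300, 300)]; // example fixed target size
NSData *scaledData = UIImagePNGRepresentation(scaled);
NSArray *dirs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *scaledPath = [[dirs objectAtIndex:0] stringByAppendingPathComponent:@"image-fixed-size.png"];
[scaledData writeToFile:scaledPath atomically:YES];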
Hopefully this is a duplicate question, but I am really stuck on writing into PNG data chunks. Can someone help me with getting headers from the data in Objective-C? Here is the reference: Section 4.7.2, part c, Textual Information.
Thanks
I think there is no need to care about PNG data chunks; just try the code below:
// Load the target image, then draw the text on top of it
UIImage *image = [UIImage imageNamed:@"targetImage.png"];
NSString *text = @"Watermark!";
CGPoint startPoint = CGPointMake(20, 30);
UIFont *font = [UIFont boldSystemFontOfSize:12];
UIGraphicsBeginImageContext(image.size);
[image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
CGRect rect = CGRectMake(startPoint.x, startPoint.y, image.size.width, image.size.height);
[[UIColor whiteColor] set];
[text drawInRect:CGRectIntegral(rect) withFont:font];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Convert the UIImage object into NSData formatted according to the PNG spec
NSData *imageData = UIImagePNGRepresentation(newImage);
// Save the image data to disk as a PNG file
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
NSString *imageSubdirectory = [documentsDirectory stringByAppendingPathComponent:@"watermarkFolderName"];
NSString *filePath = [imageSubdirectory stringByAppendingPathComponent:@"watermark.png"];
[imageData writeToFile:filePath atomically:YES];
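One thing worth noting: writeToFile:atomically: will fail if the subdirectory does not exist yet, so you may need to create it before the write. A sketch, assuming the same imageSubdirectory path as above:
// Sketch: make sure the target folder exists before writing the PNG into it.
NSError *dirError = nil;
if (![[NSFileManager defaultManager] createDirectoryAtPath:imageSubdirectory
                               withIntermediateDirectories:YES
                                                attributes:nil
                                                     error:&dirError]) {
    NSLog(@"Could not create watermark folder: %@", dirError);
}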
I'm trying to figure out this problem but have failed after trying everything I found on SO or Google. The problem is that when I convert a UIImage to NSData using UIImageJPEGRepresentation or UIImagePNGRepresentation, memory usage increases to 30 MB (believe it or not).
Here is my code
myImage = image;
LoginSinglton *loginObj = [LoginSinglton sharedInstance];
NSError *error;
NSData *pngData = UIImageJPEGRepresentation(image, scaleValue); // scaleValue is 1
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsPath = [paths objectAtIndex:0]; //Get the docs directory
self.imageCurrentDateAndTime = [self getTimeAndDate];
self.filePathAndDirectory = [documentsPath stringByAppendingPathComponent:@"Photos Dir"];
NSLog(@"Documents path %@", self.filePathAndDirectory);
if (![[NSFileManager defaultManager] createDirectoryAtPath:self.filePathAndDirectory
                                withIntermediateDirectories:NO
                                                 attributes:nil
                                                      error:&error])
{
    NSLog(@"Create directory error: %@", error);
}
self.imageName = [NSString stringWithFormat:@"photo-%@-%@.jpg", loginObj.userWebID, self.imageCurrentDateAndTime];
NSString *filePath = [self.filePathAndDirectory stringByAppendingPathComponent:self.imageName];
[pngData writeToFile:filePath atomically:YES]; //Write the file
[self writeImageThumbnailToFolder:image];
[self writeImageHomeViewThumbnailToFolder:image];
I have tried the following solutions as well:
UIImageJPEGRepresentation - memory release issue
1. Used @autoreleasepool
2. Set pngData = nil;
but I am still facing the memory issue.
EDIT: I think I haven't conveyed my problem clearly. It's OK if UIImageJPEGRepresentation takes a lot of memory, but the memory should return to its earlier level after the image has been saved. Hope this explains it in more detail.
Use a scaleValue of less than 1. Even 0.9 will massively reduce the memory footprint with minimal quality loss.
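A minimal sketch of that idea, wrapping the conversion in an @autoreleasepool so the encoded buffer can be released as soon as the file has been written (the image and filePath variables are illustrative):
// Sketch: encode with a lower JPEG quality and let the autorelease pool
// drain the temporary NSData right after the write finishes.
@autoreleasepool {
    NSData *jpegData = UIImageJPEGRepresentation(image, 0.9); // 0.9 instead of 1.0
    [jpegData writeToFile:filePath atomically:YES];
}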
Try this:
UIGraphicsBeginImageContext(newSize);
[image drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
If that doesn't give you the result you expect, don't worry, try this too:
UIImage *small = [UIImage imageWithCGImage:original.CGImage scale:0.25 orientation:original.imageOrientation];
Get a sample app for editing or scaling images here:
http://mattgemmell.com/2010/07/05/mgimageutilities/
To resize using minimum memory, try using Core Graphics.
SO answer by @Mina Nabil, see the full answer for more details.
#import <ImageIO/ImageIO.h>

- (UIImage *)resizedImage:(UIImage *)inImage toRect:(CGRect)thumbRect {
    CGImageRef imageRef = [inImage CGImage];
    CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(imageRef);
    if (alphaInfo == kCGImageAlphaNone)
        alphaInfo = kCGImageAlphaNoneSkipLast;

    // Build a bitmap context that's the size of the thumbRect
    CGContextRef bitmap = CGBitmapContextCreate(
        NULL,
        thumbRect.size.width,                  // width
        thumbRect.size.height,                 // height
        CGImageGetBitsPerComponent(imageRef),  // really needs to always be 8
        4 * thumbRect.size.width,              // row bytes
        CGImageGetColorSpace(imageRef),
        alphaInfo
    );

    // Draw into the context; this scales the image
    CGContextDrawImage(bitmap, thumbRect, imageRef);

    // Get a CGImage from the context and wrap it in a UIImage
    CGImageRef ref = CGBitmapContextCreateImage(bitmap);
    UIImage *result = [UIImage imageWithCGImage:ref];

    CGContextRelease(bitmap); // OK if NULL
    CGImageRelease(ref);

    return result;
}
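A possible usage sketch, assuming the method above lives on the same class and original is the full-size UIImage (both names are illustrative):
// Sketch: scale down to a 200 x 200 thumbnail with the Core Graphics helper above.
UIImage *thumb = [self resizedImage:original toRect:CGRectMake(0, 0, 200, 200)];
NSData *thumbData = UIImageJPEGRepresentation(thumb, 0.8);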
I'm trying to resize an image on a background thread and the app always crashes after a few low memory warnings. How can I rewrite the code below to fix this?
float max = 1024*1024;
NSData *pngData = UIImagePNGRepresentation(setImage);
while ([pngData length] > max) {
    pngData = nil;
    CGSize newSize = CGSizeMake(setImage.size.width * .9, setImage.size.height * .9);
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSLog(@"scale: %f", (1024.0*1024.0)/((float)[pngData length]));
    pngData = UIImagePNGRepresentation(image);
}
NSLog(@"image length: %i", [pngData length]);
[pngData writeToFile:imageLocation atomically:YES];
I have already tried calculating the scale and replacing the .9 in the code with a scale value:
float scale = (max)/((float)[pngData length]);
CGSize newSize = CGSizeMake(setImage.size.width*scale, setImage.size.height *scale);
This made the image too small.
The end goal is to take an image from the camera and save it to disk. I originally had to resize the image because I was getting a "Low Memory warning" when loading the image.
Your code causes an infinite loop and creates images until you run out of memory. Try something like this to fix the infinite loop:
float max = 1024*1024;
NSData *pngData = UIImagePNGRepresentation(setImage);
CGSize newSize = setImage.size;
while ([pngData length] > max) {
    newSize = CGSizeMake(newSize.width * 0.9, newSize.height * 0.9);
    UIGraphicsBeginImageContext(newSize);
    [setImage drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSLog(@"scale: %f", (1024.0*1024.0)/((float)[pngData length]));
    pngData = UIImagePNGRepresentation(image);
    image = nil;
}
NSLog(@"image length: %lu", (unsigned long)[pngData length]);
[pngData writeToFile:imageLocation atomically:YES];
Is there any reason why you need to do this by hand? If you use a UIImageView and set the image either with initWithImage: or via the image property, and then change the UIImageView's frame, the displayed image will resize accordingly.
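For display purposes only (this does not reduce the saved file size), that looks roughly like this sketch, using the setImage variable from the question:
// Sketch: let UIImageView scale the image for display instead of resizing by hand.
UIImageView *imageView = [[UIImageView alloc] initWithImage:setImage];
imageView.frame = CGRectMake(0, 0, 200, 200);             // display size in points
imageView.contentMode = UIViewContentModeScaleAspectFit;  // keep the aspect ratio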